
Bridgehorn - Technical Lead - Data Engineering

SMITHMAVEN TECHNOLOGIES PRIVATE LIMITED
3 - 6 Years
₹10-18 LPA
Hyderabad

Posted on: 29/04/2026

Job Description

Job Summary :


As a Tech Lead Data Engineering, you will take ownership of designing, building, and scaling robust data platforms and pipelines. You will lead a small team of data engineers, collaborate closely with stakeholders, and ensure best practices in data architecture, quality, and performance.


Roles & Responsibilities :


- Lead the design and development of scalable data pipelines and architectures, ensuring best practices in coding, data modeling, and system design

- Provide technical guidance and mentorship to junior data engineers and conduct code and design reviews


- Design and implement ETL/ELT pipelines using Azure, AWS, or Snowflake ecosystems, including medallion (Bronze → Silver → Gold) data lake architecture


- Optimize data pipelines for performance, scalability, and reliability, and manage data ingestion strategies (full and incremental loads)


- Work with tools such as Azure Data Factory, Databricks, Microsoft Fabric, AWS Glue, Redshift, or Snowflake to build and manage data platforms


- Design and optimize data models (star and snowflake schemas) and develop high-performance SQL queries


- Collaborate with analysts and business teams to deliver clean datasets and support dashboard development using Power BI or Tableau


- Translate business requirements into scalable technical solutions and work cross-functionally with data teams


- Define and enforce data standards, documentation, data quality, lineage, and governance practices


- Own end-to-end delivery of data engineering projects, ensuring timelines, quality, and scalability


Required Skills & Qualifications :


- 3 - 6 years of experience in data engineering or data platform development, with experience leading or mentoring small teams

- Strong proficiency in SQL and Python, with hands-on experience in building and optimizing data pipelines


- Experience with at least one ecosystem: Azure (Data Factory, Databricks, Microsoft Fabric, DevOps), AWS (Glue, Redshift, SageMaker or Bedrock), or Snowflake


- Strong understanding of data modeling, data warehousing concepts, ETL/ELT design, and medallion architecture


- Experience building scalable, reliable, and efficient data pipelines and modern data platforms


- Ability to analyze complex datasets, perform debugging, optimization, and performance tuning


- Strong communication skills with the ability to collaborate effectively with cross-functional teams and stakeholders


Good to Have :


- Experience with Power BI or Tableau

- Exposure to CI/CD pipelines and DevOps practices


- Experience working with real-time or streaming data pipelines


- Certifications in Azure, AWS, or Snowflake


Benefits :


- Competitive salary and performance-based bonuses

- Opportunities for certifications and professional growth

- Collaborative, innovation-driven work environment


- Direct exposure to decision-makers and client interactions
