Posted on: 21/12/2025
Description:
Role Summary:
We are seeking a Lead Data Engineer to design, build, and own scalable data platforms and pipelines that support analytics, reporting, and data-driven decision-making.
This role requires strong technical expertise, architectural thinking, and leadership capability.
You will guide data engineering practices, mentor engineers, and collaborate closely with analytics, data science, and product teams.
Experience: 7 to 10+ years
Department: Data / Engineering
Location: On-site / Hybrid / Remote (as applicable)
Key Responsibilities:
- Design and own scalable, reliable, and secure data architectures for batch and real-time processing.
- Define data platform standards, best practices, and long-term technical roadmap.
- Evaluate and implement tools for ingestion, processing, storage, and analytics.
- Build and maintain ETL/ELT pipelines for structured and unstructured data.
- Ensure high availability, performance, and data quality across pipelines.
- Implement data validation, monitoring, alerting, and recovery mechanisms.
- Lead data solutions on cloud platforms such as AWS, Azure, or GCP.
- Design data lakes and warehouses using platforms such as Amazon S3, BigQuery, Redshift, Snowflake, or Azure Synapse.
- Optimize cost, performance, and scalability of data infrastructure.
- Design and manage real-time data processing using Kafka, Kinesis, Pub/Sub, or similar.
- Support event-driven architectures and streaming analytics use cases.
- Define and enforce data governance, access controls, and security best practices.
- Ensure compliance with data privacy and regulatory standards.
- Implement metadata management, lineage, and data cataloging.
- Lead and mentor data engineers; conduct code reviews and provide technical guidance.
- Collaborate with analytics, BI, data science, and product teams to understand data needs.
- Translate business requirements into scalable technical solutions.
- Improve data engineering processes, automation, and reliability.
- Drive adoption of modern data engineering practices and tools.
- Evaluate emerging technologies to enhance platform efficiency.
Required Skills & Experience:
- Strong experience with Python, SQL, and data engineering frameworks.
- Hands-on expertise with ETL/ELT tools (e.g., Airflow, dbt, AWS Glue, Dataflow, Azure Data Factory).
- Deep understanding of data modeling, schema design, and performance tuning.
- Extensive experience with cloud data ecosystems (AWS / Azure / GCP).
- Proficiency with data warehouses and lakehouse architectures.
- Experience with containerization and orchestration (Docker, Kubernetes) is a plus.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1593334