Posted on: 14/04/2026
Job Title : Lead Data Engineer
Location : Any
Experience : 7 to 10 Years
Notice : Immediate or Serving Notice (15 Days)
Role Summary :
We are hiring a Lead Data Engineer to build and scale modern data infrastructure, data pipelines, and cloud-based data platforms supporting analytics, business intelligence, and machine learning systems.
The ideal candidate has strong experience in ETL/ELT development, distributed data systems, real-time streaming, and cloud data platforms (AWS/GCP/Azure), along with expertise in data governance, data modeling, and data observability.
Core Skills :
- Data Engineering
- Data Platform
- ETL, ELT, Data Pipelines
- Big Data
- Distributed Systems
- Data Architecture
- Data Modeling
- SQL, Python
- Cloud Platforms (AWS, GCP, Azure)
- Data Warehouses (Snowflake, BigQuery, Redshift)
- Workflow Orchestration (Apache Airflow, Dagster, Prefect)
- Streaming (Apache Kafka, Apache Flink, AWS Kinesis)
- dbt
- Terraform
- Data Governance
- Data Quality
- Data Observability
- Machine Learning Pipelines
- Data Mesh
- PII Compliance, SOX Compliance
Key Responsibilities :
- Build and maintain scalable ETL/ELT pipelines for batch and real-time data processing
- Design and optimize data platform architecture and data infrastructure
- Develop real-time streaming pipelines using Kafka / Flink / Kinesis
- Implement data modeling and transformation frameworks (dbt, SQL)
- Ensure data reliability and scalability, and optimize pipeline performance
- Establish data governance, data lineage, access control, and compliance frameworks
- Build data observability systems (data quality, schema validation, monitoring dashboards)
- Collaborate with data science, analytics, product, and engineering teams
- Support machine learning pipelines and feature engineering workflows
- Mentor engineers and drive best practices in data engineering
Required Qualifications :
- 7+ years of experience in Data Engineering / Data Platform Engineering
- Strong expertise in Python and SQL
- Experience with cloud platforms (AWS / GCP / Azure)
- Hands-on experience with data warehouses (Snowflake / BigQuery / Redshift)
- Experience with workflow orchestration tools (Airflow / Dagster / Prefect)
- Knowledge of real-time data streaming technologies (Kafka / Flink / Kinesis)
- Experience with Infrastructure as Code (Terraform)
- Strong understanding of data modeling, distributed systems, and system design
Strong Signals (Good Candidates) :
- Experience with Kafka / streaming pipelines
- Built scalable distributed systems
- Hands-on with dbt / data modeling
- Worked on ML pipelines / feature stores
- Experience with Terraform / Infra as Code
- Designed data platform architecture end-to-end
- Implemented data governance & compliance frameworks
- Experience with Data Mesh / domain-driven design
- Led teams / mentored engineers
- Built self-service data platforms
Posted in : Data Engineering
Functional Area : Data Engineering
Job Code : 1628159