Posted on: 16/04/2026
Job Description:
We are looking for an ETL Developer with hands-on experience in Apache Airflow to build and maintain scalable data pipelines. The candidate will be responsible for developing data workflows, managing scheduling processes, and ensuring smooth data integration across systems.
Key Responsibilities:
- Design, develop, and maintain ETL pipelines for data integration and processing.
- Build and manage workflows using Apache Airflow.
- Perform data extraction, transformation, and loading from multiple data sources.
- Monitor, troubleshoot, and optimize ETL jobs and workflows.
- Ensure data accuracy, consistency, and reliability across systems.
- Collaborate with data engineers and business teams for data requirements.
Required Skills:
- 2 to 3 years of experience in ETL development.
- Hands-on experience with Apache Airflow.
- Strong knowledge of Python and SQL.
- Experience in building and scheduling data pipelines.
- Understanding of data warehousing and data integration concepts.
Preferred Skills:
- Exposure to big data technologies such as Apache Spark.
- Experience with cloud platforms like Amazon Web Services, Microsoft Azure, or Google Cloud.
- Good analytical and problem-solving skills.
Notice Period:
Immediate joiners or candidates with a short notice period preferred.
Posted in
Data Engineering
Functional Area
Data Engineering
Job Code
1628824