Posted on: 07/03/2026
Description :
We are looking for an experienced Data Engineer with strong expertise in AWS, DBT, Databricks, and Apache Airflow to join our growing data engineering team.
Immediate joiners preferred
Role Overview :
The ideal candidate will design, develop, and maintain scalable data pipelines and data platforms to support analytics and business intelligence initiatives.
Key Responsibilities :
- Design and build scalable data pipelines using AWS, Databricks, DBT, and Airflow.
- Develop and optimize ETL/ELT workflows for large-scale data processing.
- Implement data transformation models using DBT.
- Orchestrate workflows using Apache Airflow.
- Work with Databricks for big data processing and analytics.
- Ensure data quality and reliability, and optimize pipeline performance.
- Collaborate with data analysts, engineers, and business teams.
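As a rough illustration of the extract-transform-load pattern these responsibilities describe (a minimal sketch using plain Python and sqlite3 as stand-ins; in the role itself the extract would pull from AWS sources, the transform would be a DBT model on Databricks, and Airflow would schedule the steps — the table and column names here are hypothetical):

```python
import sqlite3

# Extract: raw source rows (in practice read from an AWS source such as S3)
raw_orders = [
    ("2026-03-01", "widget", 3, 9.99),
    ("2026-03-01", "gadget", 1, 24.50),
    ("2026-03-02", "widget", 2, 9.99),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (day TEXT, product TEXT, qty INTEGER, unit_price REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?, ?)", raw_orders)

# Transform: aggregate to daily revenue -- the kind of model DBT expresses as SQL
daily_revenue = conn.execute(
    "SELECT day, ROUND(SUM(qty * unit_price), 2) AS revenue "
    "FROM orders GROUP BY day ORDER BY day"
).fetchall()

# Load: in a real pipeline this result would land in a warehouse table
print(daily_revenue)
```

In production the same transform would live as a versioned SQL model in a DBT project, with an Airflow DAG triggering the extract and `dbt run` on a schedule.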
Required Skills :
- Strong experience with AWS data services
- Hands-on experience with Databricks
- Experience in DBT (Data Build Tool)
- Workflow orchestration using Apache Airflow
- Strong SQL and Python skills
- Experience in data warehousing and ETL pipelines
Posted in : Data Engineering
Functional Area : Data Engineering
Job Code : 1618662