Posted on: 02/04/2026
Data Engineer - Databricks / Airflow / AWS Lakehouse
Work Mode: Hybrid
Experience Required: 4-7 Years
Job Locations: Pune, Bangalore, Chennai
Interview Mode: In-person (Face-to-Face)
Working Hours: 11:00 AM - 8:00 PM IST
Job Overview:
We are looking for an experienced Data Engineer to join our team. The ideal candidate has strong expertise in Python, Databricks, and AWS-based Lakehouse architecture, along with hands-on experience in data pipeline development and orchestration.
Key Responsibilities:
- Design, build, and maintain scalable data pipelines using Python and Airflow
- Work extensively with Databricks for data processing and analytics
- Implement and manage AWS Lakehouse architecture
- Develop and optimize data models and storage solutions
- Work with Apache Iceberg for large-scale data management
- Ensure data quality, reliability, and performance
- Collaborate with cross-functional teams to deliver data-driven solutions
Required Skills:
- Strong proficiency in Python
- Hands-on experience with Databricks
- Experience with Apache Airflow
- Good understanding of AWS Lakehouse architecture
- Knowledge of Amazon S3, AWS Glue
- Experience in data modeling and Lakehouse patterns
- Familiarity with Apache Iceberg
Preferred Skills:
- Knowledge of AWS IAM and KMS (encryption)
- Understanding of networking basics in AWS
- Experience with Databricks on AWS
- Familiarity with monitoring/observability tools like CloudWatch, Datadog
Additional Requirements:
- Candidates must be willing to attend face-to-face interviews
- A copy of the PAN card and a recent passport-size photograph are mandatory at the time of submission
- Only candidates open to the specified work mode and working hours should apply
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1625570