Posted on: 25/12/2025
Description :
Exp : 4 - 5 yrs
Edu : Any Graduates
Work Location : Airoli, Mumbai (Work From Office)
Notice Period : Immediate - 15 days
Skills : PySpark, Airflow, SQL, DBT
Job Summary :
The ideal candidate will have strong expertise in PySpark, Airflow, SQL, and DBT, and will work closely with analytics and business teams to enable reliable, high-quality data delivery.
Key Responsibilities :
- Orchestrate and schedule workflows using Apache Airflow (see the sketch after this list).
- Develop and optimize complex SQL queries for data transformation and analysis.
- Implement data transformation and modeling using DBT.
- Ensure data quality, consistency, and reliability across data platforms.
- Collaborate with data analysts, data scientists, and stakeholders to understand data requirements.
- Monitor and troubleshoot data pipeline failures and performance issues.
- Maintain documentation for data workflows and processes.
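
As a rough illustration of the Airflow and PySpark work in the bullets above, here is a minimal sketch: a daily Airflow DAG wrapping a small PySpark aggregation. The DAG id, file paths, and column names are illustrative assumptions, not details from this posting, and the schedule argument assumes Airflow 2.4+ (older versions use schedule_interval).

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def transform_daily_orders():
    # Hypothetical PySpark job: aggregate raw order events into a daily summary.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily_orders_transform").getOrCreate()
    raw = spark.read.parquet("/data/raw/orders")  # input path is an assumption
    daily = (
        raw.groupBy("order_date")  # column names are assumptions
        .agg(
            F.sum("amount").alias("total_amount"),
            F.count("*").alias("order_count"),
        )
    )
    daily.write.mode("overwrite").parquet("/data/curated/daily_orders")
    spark.stop()


with DAG(
    dag_id="daily_orders_pipeline",  # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
):
    PythonOperator(
        task_id="transform_daily_orders",
        python_callable=transform_daily_orders,
    )

In a stack like this one, the downstream SQL modeling would typically live in DBT models rather than inside the DAG, with Airflow triggering DBT runs as a separate task.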
Required Skills & Qualifications :
- Strong hands-on experience with PySpark.
- Experience working with Apache Airflow for workflow orchestration.
- Advanced knowledge of SQL and data modeling concepts.
- Practical experience with DBT for data transformations.
- Understanding of data warehousing and ETL/ELT processes.
- Bachelor's degree in any discipline.
Posted in : Data Engineering
Functional Area : Data Engineering
Job Code : 1594591