Posted on: 28/09/2025
Key Responsibilities:
- Design, develop, and maintain robust data pipelines and ETL/ELT workflows using PySpark, Python, and SQL.
- Build and manage data ingestion and transformation processes from various sources including Hive, Kafka, and cloud-native services.
- Orchestrate workflows using Apache Airflow and ensure timely and reliable data delivery.
- Work with large-scale big data systems to process structured and unstructured datasets.
- Implement data quality checks, monitoring, and alerting mechanisms.
- Collaborate with cross-functional teams including data scientists, analysts, and product managers to understand data requirements.
- Optimize data processing for performance, scalability, and cost-efficiency.
- Ensure compliance with data governance, security, and privacy standards.
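The pipeline responsibilities above (transformation plus data-quality checks) follow a common extract-transform-validate-load pattern. A minimal sketch in plain Python is below; all names (`RAW_ROWS`, `transform`, `quality_check`, the column names) are illustrative assumptions, not part of the stack named in this posting:

```python
# Minimal ETL sketch: extract -> transform -> quality check -> load.
# Illustrative only; a production pipeline would use PySpark/SQL at scale.

RAW_ROWS = [
    {"user_id": "1", "amount": "10.50"},
    {"user_id": "2", "amount": "-3.00"},   # fails the quality rule below
    {"user_id": "3", "amount": "7.25"},
]

def transform(row):
    """Cast string fields to typed values (a typical 'T' step)."""
    return {"user_id": int(row["user_id"]), "amount": float(row["amount"])}

def quality_check(row):
    """Reject rows violating a simple business rule (non-negative amount)."""
    return row["amount"] >= 0

def run_pipeline(rows):
    """Split transformed rows into loadable and quarantined sets."""
    transformed = [transform(r) for r in rows]
    good = [r for r in transformed if quality_check(r)]
    bad = [r for r in transformed if not quality_check(r)]
    return good, bad

good, bad = run_pipeline(RAW_ROWS)
print(len(good), len(bad))  # 2 1
```

In an orchestrated setting, each stage (extract, transform, quality check, load) would typically map to a separate Airflow task so failures can be retried and alerted on independently.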
Required Skills & Qualifications:
- 5+ years of experience in data engineering or related roles.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1553270