Posted on: 09/09/2025
Job role: Data Engineer
Location: Hyderabad / Pune (preferred)
Exp: 3-5 years
Joining: Immediate or within 15 days
Responsibilities:
- Design, build, and maintain scalable and efficient data pipelines using Python and PySpark
- Develop and optimize ETL/ELT workflows for large-scale datasets
- Work extensively with Google Cloud Platform (GCP) services including BigQuery, Dataflow, and Cloud Functions
- Implement containerized solutions using Docker, and manage code through Git and CI/CD pipelines
- Collaborate with data scientists, analysts, and other engineers to deliver high-quality data solutions
- Monitor, troubleshoot, and improve the performance of data pipelines and workflows
Skills & Qualifications:
- Strong experience in ETL/ELT, data modeling, distributed computing, and performance tuning
- Hands-on expertise in GCP and its services
- Working knowledge of workflow orchestration tools such as Apache Airflow or Cloud Composer
- GCP certification is a plus
- Experience with Docker, CI/CD practices, and version control tools like Git
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1543570