Posted on: 20/07/2025
Data Engineer
Location: Hyderabad / Pune (preferred)
Joining: Immediate or within 15 days
Experience Required: 3-5 years
Are you a Python & PySpark expert with hands-on experience in GCP? Passionate about building scalable, high-performance data pipelines? Join our fast-paced team and be part of impactful projects in the cloud data ecosystem.
Key Responsibilities:
- Develop and optimize ETL/ELT workflows for large-scale datasets
- Work extensively with Google Cloud Platform (GCP) services including BigQuery, Dataflow, and Cloud Functions
- Implement containerized solutions using Docker and manage code through Git and CI/CD pipelines
- Collaborate with data scientists, analysts, and other engineers to deliver high-quality data solutions
- Monitor, troubleshoot, and improve the performance of data pipelines and workflows
Preferred Skills & Qualifications:
- Strong experience in ETL/ELT, data modeling, distributed computing, and performance tuning
- Hands-on expertise in GCP and its services
- Working knowledge of workflow orchestration tools such as Apache Airflow or Cloud Composer
- GCP certification is a plus
- Experience with Docker, CI/CD practices, and version control tools like Git
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1516225