Posted on: 15/09/2025
Responsibilities:
- Design, build, and maintain data pipelines and ETL workflows on Google Cloud Platform.
- Work with BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage to enable scalable data solutions (a load-job sketch follows this list).
- Develop and optimize data models, transformations, and analytics layers.
- Write efficient Python/SQL scripts for data processing and automation.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Implement best practices for performance, scalability, security, and cost optimization.
- Support data migration and integration from traditional databases to BigQuery.
- Work with CI/CD pipelines, Docker, and Kubernetes for deployment.
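For orientation, here is a minimal sketch of the kind of batch load step the responsibilities above describe, using the google-cloud-bigquery Python client. The project ID, bucket path, and table name are hypothetical placeholders, not details from this posting.

    # Minimal sketch: load a CSV file from Cloud Storage into BigQuery.
    # Project, bucket, and table names below are hypothetical placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,  # skip the header row
        autodetect=True,      # infer the schema from the file
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )

    load_job = client.load_table_from_uri(
        "gs://my-bucket/sales/2025-09-15.csv",  # hypothetical source file
        "my-project.analytics.daily_sales",     # hypothetical destination table
        job_config=job_config,
    )
    load_job.result()  # block until the load job finishes

    table = client.get_table("my-project.analytics.daily_sales")
    print(f"Loaded table now has {table.num_rows} rows")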
Requirements:
- 5+ years of hands-on experience in Data Engineering
- Strong expertise in Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, and Dataproc
- Proficiency in Python and SQL for large-scale data processing
- Solid experience with ETL pipelines, data modeling, and performance tuning
- Familiarity with Airflow/Cloud Composer for orchestration (a DAG sketch follows this list)
- Experience with Docker, Kubernetes, Terraform, CI/CD is a plus
- Excellent problem-solving and communication skills
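As a reference point for the orchestration requirement, here is a minimal sketch of a daily ETL DAG, assuming Airflow 2.x as typically run on Cloud Composer. The DAG ID and the task functions are hypothetical illustrations, not part of this posting.

    # Minimal sketch: a daily extract-transform-load DAG for Airflow 2.x.
    # dag_id and the task functions are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull raw data from the source system")

    def transform():
        print("clean and model the extracted data")

    def load():
        print("write the modelled data to BigQuery")

    with DAG(
        dag_id="daily_sales_etl",      # hypothetical DAG name
        start_date=datetime(2025, 1, 1),
        schedule="@daily",             # Airflow 2.4+ keyword; older versions use schedule_interval
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        # Run the three steps in sequence each day.
        extract_task >> transform_task >> load_task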
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1545743