Posted on: 15/12/2025
Description:
Role Overview
We are looking for an experienced GCP Data Engineer with strong hands-on expertise in building, managing, and optimizing large-scale data pipelines on Google Cloud Platform.
The ideal candidate should have solid experience with BigQuery, Dataproc, Dataflow, and end-to-end data engineering workflows.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines on GCP using BigQuery, Dataproc, Dataflow, Pub/Sub, and Cloud Storage.
- Build and optimize data ingestion, ETL/ELT processes, and distributed data processing workflows.
- Work with large datasets to ensure high performance, reliability, and data quality within data platforms.
- Implement data transformation logic using Python, SQL, PySpark, or similar technologies.
- Optimize BigQuery performance through query tuning, partitioning, clustering, and cost management.
- Collaborate with analytics, BI, and application teams to deliver data solutions aligned with business needs.
- Ensure security, governance, and compliance across the GCP data ecosystem.
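To illustrate the kind of transformation and data-quality work described above, here is a minimal, hypothetical sketch in plain Python. The field names (`user_id`, `amount`) and the `clean_records` helper are assumptions for illustration only; in practice this logic would typically run inside a Dataflow or PySpark pipeline before loading into BigQuery.

```python
from typing import Iterable

def clean_records(rows: Iterable[dict]) -> list[dict]:
    """Normalize raw records and drop rows failing basic quality checks.

    Hypothetical example: field names are assumptions, not part of any
    specific pipeline described in this posting.
    """
    cleaned = []
    for row in rows:
        # Quality check: require a non-empty user_id.
        if not row.get("user_id"):
            continue
        # Quality check: amount must parse as a number.
        try:
            amount = float(row.get("amount", ""))
        except ValueError:
            continue
        # Transformation: normalize the identifier, round the amount.
        cleaned.append({
            "user_id": str(row["user_id"]).strip().lower(),
            "amount": round(amount, 2),
        })
    return cleaned
```

The same filter-and-normalize pattern maps directly onto a PySpark `DataFrame` transform or a Beam `ParDo` step in Dataflow.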
Required Skills & Experience:
- 5-10 years of total experience, with at least 3 years in GCP data engineering.
- Strong hands-on experience in BigQuery, Dataproc, Dataflow, Cloud Composer/Airflow.
- Strong SQL and Python skills; experience in Spark or PySpark preferred.
- Good understanding of data warehousing concepts, ETL/ELT architectures, and distributed data processing.
- Experience with CI/CD pipelines, Git, and DevOps practices in cloud environments.
- Knowledge of Pub/Sub, Cloud Storage, and Cloud Functions is an added advantage.
Posted by
Namrata Kamble
Assistant Manager - Technology Human Resource at ACME SERVICES PRIVATE LIMITED
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1590555