Posted on: 14/02/2026
Job Summary:
We are looking for a skilled GCP Data Engineer to design, build, and optimize scalable data pipelines on Google Cloud Platform.
The ideal candidate will have strong experience in data ingestion, transformation, orchestration, and cloud-based data warehousing using GCP services.
Key Responsibilities:
- Design and develop batch and real-time data pipelines using Dataflow (a minimal sketch follows this list)
- Build scalable data lakes using Google Cloud Storage
- Develop and optimize data models in BigQuery
- Implement streaming ingestion using Cloud Pub/Sub
- Manage CDC (change data capture) pipelines using Datastream
- Orchestrate workflows using Cloud Composer
- Implement governance using Dataplex and IAM policies
- Ensure performance tuning, cost optimization, monitoring, and reliability
- Collaborate with analytics, BI, and data science teams
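
As a rough illustration of the kind of streaming pipeline described above, the sketch below reads JSON events from a Pub/Sub subscription and appends them to a BigQuery table using the Apache Beam Python SDK. All project, subscription, and table names, as well as the event schema, are hypothetical placeholders.

```python
# Minimal sketch (placeholder resource names throughout): stream JSON events
# from a Pub/Sub subscription into a BigQuery table with Apache Beam.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # streaming=True is required for unbounded Pub/Sub sources; runner and
    # project flags would normally be supplied on the command line.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="event_id:STRING,user_id:STRING,event_ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

The same code runs locally under the DirectRunner for testing and on Dataflow when launched with --runner=DataflowRunner.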
Required Skills:
- Strong experience in GCP data services (BigQuery, Dataflow, GCS, Pub/Sub)
- Hands-on experience with SQL and Python
- Experience with ETL/ELT pipeline development
- Knowledge of data modeling (Star/Snowflake schema)
- Understanding of IAM and cloud security best practices
- Experience with CI/CD and Terraform (preferred)
Good to Have:
- Experience with Spark (Dataproc)
- Knowledge of dbt
- Exposure to Looker / BI tools
- GCP Professional Data Engineer Certification
Posted by: Ravish
Last Active: N/A (the recruiter posted this job through a third-party tool)
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1612808