Posted on: 07/09/2025
Position : GCP Data Engineer
Description :
- Design, develop, and optimize batch and streaming pipelines using Dataflow (Apache Beam), Pub/Sub, Dataproc, and Data Fusion (see the Beam sketch after this list).
- Build and manage data lakes and warehouses on Cloud Storage and BigQuery; implement partitioning, clustering, and cost controls.
- Establish data governance with Dataplex and Data Catalog (lineage, classification, policies).
- Productionize with Cloud Build (CI/CD), Cloud Logging, Cloud Monitoring/Alerting, Error Reporting.
- Define data models, performance SLAs, and reliability practices (backfills, retries, dead-letter queues, IaC).
- (Architect role, additionally) Lead solution design, reference architectures, and security patterns (IAM, VPC-SC, DLP).
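Example (a minimal sketch of the kind of Pub/Sub-to-BigQuery streaming pipeline in scope, written with Beam's Python SDK; the project, subscription, and table names are placeholder assumptions, not taken from this posting):

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def run():
        # streaming=True makes the pipeline unbounded; to run on Dataflow,
        # also pass --runner=DataflowRunner with project/region/temp_location.
        options = PipelineOptions(streaming=True)
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "ReadEvents" >> beam.io.ReadFromPubSub(
                    subscription="projects/example-project/subscriptions/events-sub")
                | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
                | "WriteRows" >> beam.io.WriteToBigQuery(
                    "example-project:analytics.events",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
            )

    if __name__ == "__main__":
        run()

Retries and dead-lettering of failed inserts (the DLQs mentioned above) would be layered on top of this skeleton.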
Must-Have Skills :
- Deep GCP expertise : BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Data Fusion, Dataplex, Data Catalog.
- Strong Python and SQL; Beam/Spark proficiency.
- CI/CD with Cloud Build, Git; orchestration via Cloud Composer/Workflows.
- Observability & SRE practices on GCP : metrics, tracing, log-based alerts.
- Data modeling (lakehouse/warehouse), performance tuning, cost optimization (see the BigQuery sketch after this list).
- Security-by-design : IAM, service accounts, secrets management, perimeter controls.
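Example (a minimal sketch of the partitioning/clustering/cost-control pattern named above, using the google-cloud-bigquery client; project, dataset, and column names are placeholder assumptions):

    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")

    table = bigquery.Table(
        "example-project.analytics.events",
        schema=[
            bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
            bigquery.SchemaField("customer_id", "STRING", mode="REQUIRED"),
            bigquery.SchemaField("payload", "STRING"),
        ],
    )

    # Daily time partitioning on the event timestamp; expired partitions are
    # dropped automatically, which caps storage cost.
    table.time_partitioning = bigquery.TimePartitioning(
        type_=bigquery.TimePartitioningType.DAY,
        field="event_ts",
        expiration_ms=90 * 24 * 60 * 60 * 1000,  # retain 90 days
    )

    # Clustering on a frequently filtered column reduces bytes scanned per query.
    table.clustering_fields = ["customer_id"]

    # Requiring a partition filter prevents accidental full-table scans.
    table.require_partition_filter = True

    client.create_table(table, exists_ok=True)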
Nice to Have :
- Terraform, Cloud Functions, Looker/Looker Studio, BigQuery ML/Vertex AI integration.
- GCP Professional Data Engineer and/or Professional Cloud Architect certification.
- Client-facing solutioning and estimation experience.
Posted in : Data Engineering
Functional Area : Data Engineering
Job Code : 1541993