hirist

Big Data Engineer - Google Cloud Platform

SP Staffing Services
6 - 20 Years
Multiple Locations

Posted on: 16/03/2026

Job Description

Relevant Experience : 5+ to 11 Yrs


Location : Chennai / Bangalore / Gurgaon

Role Summary :


We are looking for a GCP Big Data Engineer with strong hands-on experience in SQL, PySpark, and Google Cloud Platform to design, build, and optimize large-scale data pipelines. The role requires end-to-end ownership of data pipelines, from development through deployment via CI/CD.

Mandatory Skills :


- SQL (advanced joins, window functions, optimization)

- Google Cloud Platform (GCP)

- BigQuery (partitioning, clustering, cost optimization)

- Python

- Dataproc (PySpark / Spark)

- Cloud Storage (GCS)

- Cloud Composer (Airflow)
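
The SQL items above emphasize window functions, which are the same standard-SQL construct in BigQuery. As a minimal sketch of the pattern (using Python's built-in sqlite3 as a stand-in engine, since both support `SUM(...) OVER (PARTITION BY ...)`; the table and column names are invented for illustration):

```python
import sqlite3

# Toy data -- the table and columns are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, order_day TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [
        ("alice", "2024-01-01", 10.0),
        ("alice", "2024-01-02", 20.0),
        ("bob",   "2024-01-01", 5.0),
        ("bob",   "2024-01-03", 7.5),
    ],
)

# Running total per customer via a window function -- the same standard-SQL
# shape used in BigQuery: SUM(...) OVER (PARTITION BY ... ORDER BY ...).
rows = conn.execute(
    """
    SELECT customer,
           order_day,
           SUM(amount) OVER (
               PARTITION BY customer ORDER BY order_day
           ) AS running_total
    FROM orders
    ORDER BY customer, order_day
    """
).fetchall()

for r in rows:
    print(r)
```

In BigQuery the same query would typically run against a table partitioned on the date column and clustered on the customer column, so the partition pruning limits scanned bytes and cost.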

Additional GCP Services :


- Pub/Sub

- Cloud Functions / Cloud Run

- Dataflow (Apache Beam)

DevOps / Tools :


- Git / GitHub

- CI/CD pipelines

- Linux / Shell scripting

Responsibilities :


- Develop Python and PySpark-based ETL/ELT pipelines

- Build and optimize Dataproc Spark jobs

- Optimize BigQuery SQL and manage costs

- Orchestrate workflows using Cloud Composer

- Implement data validation, monitoring, and error handling

- Handle incremental and batch data processing

- Perform Hadoop to GCP migration

- Manage metadata and tune pipeline performance
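
One responsibility above, incremental versus batch processing, is commonly handled with a high-watermark pattern: each run processes only records newer than the timestamp of the last successful load. A minimal pure-Python sketch of that idea (the function and field names are illustrative, not from any specific pipeline):

```python
from datetime import datetime

def incremental_load(records, watermark):
    """Return records strictly newer than the watermark, plus the new watermark.

    `records` is an iterable of dicts with an 'updated_at' datetime field;
    the names here are illustrative, not a real pipeline schema.
    """
    fresh = [r for r in records if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=watermark)
    return fresh, new_watermark

source = [
    {"id": 1, "updated_at": datetime(2024, 1, 1)},
    {"id": 2, "updated_at": datetime(2024, 1, 2)},
    {"id": 3, "updated_at": datetime(2024, 1, 3)},
]

# First run: only records after the Jan 1 watermark are picked up.
batch, wm = incremental_load(source, datetime(2024, 1, 1))

# Second run with no new data: nothing to process, watermark unchanged.
batch2, wm2 = incremental_load(source, wm)
```

In a Composer-orchestrated pipeline the watermark would typically be persisted between runs (e.g. in a metadata table), so a failed run can safely reprocess from the last committed value.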

