Posted on: 09/09/2025
Job Summary:
We are immediately hiring a skilled and motivated GCP Cloud Data Engineer for one of our esteemed clients. The ideal candidate should have hands-on experience with Python, Apache Airflow, Google BigQuery, SQL, Apache Spark, and Pub/Sub. You will be responsible for designing, developing, and maintaining robust data pipelines and cloud-based data solutions on Google Cloud Platform (GCP).
Key Responsibilities:
- Design, build, and optimize scalable and efficient data pipelines using Python and Airflow.
- Develop and maintain ETL/ELT workflows to process data across various sources and sinks.
- Work extensively with Google BigQuery for data transformation and analysis.
- Integrate real-time and batch data processing solutions using Pub/Sub and Spark.
- Write complex and optimized SQL queries for large-scale datasets.
- Collaborate with data scientists, analysts, and platform teams to deliver end-to-end data solutions.
- Ensure data quality, performance, and governance across pipelines.
- Monitor, troubleshoot, and optimize data workflows in production environments.
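For candidates unfamiliar with the role, the pipeline duties above follow a standard extract-transform-load pattern. A minimal, hypothetical sketch in plain Python (a production pipeline would use Airflow operators and the BigQuery/Pub/Sub client libraries rather than in-memory data; all names here are illustrative):

```python
# Hypothetical ETL sketch. In-memory lists stand in for real sources
# (Pub/Sub, Cloud Storage) and sinks (a BigQuery table).

def extract(source):
    """Pull raw records from a source."""
    return list(source)

def transform(records):
    """Clean and reshape records: drop rows missing an id, normalise names."""
    return [
        {"id": r["id"], "name": r["name"].strip().lower()}
        for r in records
        if r.get("id") is not None
    ]

def load(rows, sink):
    """Append transformed rows to a sink; return the number loaded."""
    sink.extend(rows)
    return len(rows)

if __name__ == "__main__":
    raw = [
        {"id": 1, "name": " Alice "},
        {"id": None, "name": "bad row"},
        {"id": 2, "name": "Bob"},
    ]
    table = []
    loaded = load(transform(extract(raw)), table)
    print(loaded)  # 2 valid rows loaded
```

In Airflow, each of these steps would typically become a task in a DAG, with scheduling, retries, and monitoring handled by the orchestrator.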
Required Technical Skills:
Programming:
- Proficient in Python for data engineering tasks and workflow orchestration.
Orchestration Tools:
- Experience with Apache Airflow.
Cloud Platform:
Strong experience with Google Cloud Platform (GCP) services, particularly:
- BigQuery
- Pub/Sub
Data Processing:
- Hands-on experience with Apache Spark.
Database Skills:
- Advanced SQL skills for querying and managing large datasets.
Preferred Qualifications:
- GCP Data Engineer certification is a plus.
- Experience with CI/CD pipelines and DevOps practices in data engineering.
- Familiarity with other GCP services like Cloud Storage, Dataflow, or Dataproc.
- Strong problem-solving skills and ability to work in a fast-paced, agile environment.
Posted in: Data Engineering
Functional Area: Big Data / Data Warehousing / ETL
Job Code: 1543371