hirist

Data Engineer - Google Cloud Platform

Diensten Tech Limited
4 - 7 Years
Multiple Locations

Posted on: 10/03/2026

Job Description

Job Summary:

We are looking for an experienced GCP Data Engineer with strong expertise in Google Cloud Platform (GCP) data services, Python, and PySpark. The ideal candidate will be responsible for building scalable data pipelines, managing large datasets, and optimizing data workflows using modern cloud-based data engineering tools.

Key Responsibilities:

- Design, develop, and maintain scalable data pipelines and ETL processes on Google Cloud Platform.

- Work with GCP services such as BigQuery, Dataflow, Dataproc, Dataplex, Data Fusion, Cloud SQL, Airflow, and Cloud Storage.

- Develop and maintain data ingestion and transformation workflows using Python and PySpark.

- Build and manage data orchestration workflows using Apache Airflow.

- Use Terraform and Tekton for infrastructure automation and CI/CD workflows.

- Implement data transfer utilities and batch data processing jobs.

- Write shell scripts for ad-hoc data import/export jobs.

- Collaborate with cross-functional teams in an Agile development environment.

- Manage and maintain code using Git or other version control systems.

- Integrate Kafka-based streaming pipelines and API-based data integrations when required.
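The pipeline and batch-processing responsibilities above can be sketched in plain Python. In the role described, this logic would typically run as a PySpark job orchestrated by Airflow on GCP; the record layout, field names, and cleaning rules below are illustrative assumptions, not part of the role.

```python
# Illustrative extract-transform-load sketch in plain Python.
# The record layout and normalization rules are assumptions made up
# for this example; a production version would use PySpark DataFrames
# and land the output in a warehouse such as BigQuery.

from datetime import date

def extract(rows):
    """Simulate ingestion: yield raw records (e.g. read from Cloud Storage)."""
    yield from rows

def transform(record):
    """Normalize a raw record: cast the id, trim strings, parse the event date."""
    return {
        "user_id": int(record["user_id"]),
        "country": record["country"].strip().upper(),
        "event_date": date.fromisoformat(record["event_date"]),
    }

def load(records):
    """Simulate loading into a warehouse table, here just a list of rows."""
    return [transform(r) for r in records]

raw = [
    {"user_id": "1", "country": " in ", "event_date": "2025-01-15"},
    {"user_id": "2", "country": "us", "event_date": "2025-01-16"},
]
table = load(extract(raw))
```

Separating extract, transform, and load into pure functions keeps each step independently testable, which is the same property that makes PySpark transformations and Airflow tasks composable.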

Required Skills:

- Strong experience with Google Cloud Platform (GCP).

- Hands-on experience with the following GCP services:

1. BigQuery

2. Dataflow

3. Dataproc

4. Data Fusion

5. Dataplex

6. Cloud SQL

7. Cloud Storage

8. Airflow

- Experience with Terraform and Tekton for infrastructure and deployment automation.

- Strong programming experience in Python and PySpark.

- Experience with PostgreSQL or other relational databases.

- Experience working with data transfer utilities.

- Experience with Git or other version control systems.

- Knowledge of API development and integration.

- Strong experience writing shell scripts for automation and data processing jobs.

- Experience working in Agile frameworks.

Preferred Skills:

- Experience with Apache Kafka or Confluent Kafka.

- Exposure to Big Data technologies and distributed data processing systems.
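The streaming exposure listed above follows a consume-process loop. A minimal sketch of that shape, simulating the topic with an in-memory deque since no broker is assumed here; the topic contents, message format, and tax calculation are illustrative assumptions, and a real deployment would use a Kafka client library instead.

```python
# Shape of a Kafka-style consume-process loop, with an in-memory deque
# standing in for a topic partition. Message layout and the processing
# step (adding a tax field) are made up for this example.

from collections import deque
import json

# Simulated topic: three JSON-encoded order events.
topic = deque(
    json.dumps({"order_id": i, "amount": 10 * i}) for i in range(1, 4)
)

processed = []

def poll(queue):
    """Return the next message, or None when the simulated topic is drained."""
    return queue.popleft() if queue else None

while (msg := poll(topic)) is not None:
    event = json.loads(msg)                              # deserialize the value
    event["amount_with_tax"] = round(event["amount"] * 1.18, 2)
    processed.append(event)                              # stand-in for a sink
```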

Education:

- Bachelor's degree in Computer Science, Information Technology, or a related field.
