
Data Engineer - Google Cloud Platform

AWIGN ENTERPRISES PRIVATE LIMITED
Multiple Locations
5 - 8 Years

Posted on: 23/11/2025

Job Description

Job Title : GCP Data Engineer


Location : Chennai, Bangalore, Hyderabad, Pune (Hybrid mode)


Experience : 5+ years


Employment Type : Full-time


Shift : 2pm - 11pm IST


Role Summary :




We are seeking a highly skilled and experienced Google Cloud Platform (GCP) Data Engineer to design, build, and optimize scalable data pipelines and analytics solutions on GCP. The ideal candidate will have deep expertise in cloud-native data engineering, strong programming skills, and a solid understanding of data architecture, ETL/ELT processes, and big data technologies. This role involves close collaboration with data scientists, analysts, and business stakeholders to deliver high-performance, secure, and reliable data solutions.


Key Responsibilities :




- Design and implement data pipelines using GCP services such as Dataflow, Dataproc, BigQuery, Pub/Sub, Cloud Storage, and Composer

- Develop and maintain ETL/ELT workflows for structured and unstructured data sources

- Optimize BigQuery SQL queries for performance and cost efficiency

- Build streaming data solutions using Apache Beam and Pub/Sub (an illustrative sketch follows this list)

- Implement data quality checks, validation rules, and monitoring mechanisms

- Collaborate with data architects to define data models, schemas, and partitioning strategies

- Integrate data from various sources including APIs, flat files, relational databases, and third-party platforms

- Automate workflows using Cloud Composer (Airflow) and CI/CD pipelines

- Ensure data security, governance, and compliance with organizational and regulatory standards

- Troubleshoot and resolve issues related to data ingestion, transformation, and availability

- Document technical designs, data flows, and operational procedures
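
For context on the pipeline and streaming items above, here is a minimal sketch of a Beam streaming job in Python, the kind that would run on Dataflow and land events in BigQuery. All project, topic, and table names are hypothetical placeholders, and the destination table is assumed to already exist :

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Streaming mode; Dataflow runner, project, and region options would be passed in as well.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")       # hypothetical topic
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",                   # hypothetical table
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
        )


if __name__ == "__main__":
    run()

The same pipeline can be tested locally with the DirectRunner and promoted to Dataflow in production by changing only the runner options.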


Required Skills and Qualifications :




- Bachelor's or Master's degree in Computer Science, Engineering, or a related field

- 5+ years of experience in data engineering

- Strong proficiency in Python, PySpark, and SQL; Java or Scala is a plus

- Hands-on experience with BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage

- Experience with Apache Beam, Spark, or Hadoop for large-scale data processing

- Familiarity with Cloud Composer (Airflow) for orchestration (an illustrative DAG sketch follows this list)

- Knowledge of data warehousing, data lakes, and dimensional modeling

- Experience with CI/CD tools like Jenkins, GitHub Actions, or Cloud Build

- Excellent problem-solving, communication, and stakeholder management skills
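
For the Cloud Composer (Airflow) item above, a minimal orchestration sketch, assuming an Airflow 2.x Composer environment with the Google provider installed. The DAG ID, schedule, and table names are hypothetical :

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_events_elt",          # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # One ELT step: rebuild a daily summary table in BigQuery (hypothetical tables).
    build_daily_summary = BigQueryInsertJobOperator(
        task_id="build_daily_summary",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `my-project.analytics.daily_summary` AS
                    SELECT DATE(event_ts) AS event_date, COUNT(*) AS events
                    FROM `my-project.analytics.events`
                    GROUP BY event_date
                """,
                "useLegacySql": False,
            }
        },
    )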


Preferred Qualifications :

- GCP Professional Data Engineer Certification

- Experience with machine learning pipelines or data science workflows

- Familiarity with Looker, Tableau, or Power BI for data visualization

- Exposure to Kafka, Snowflake, or other cloud platforms (AWS, Azure)

- Experience in healthcare, finance, or retail domains is a plus

