
Reveille Technologies - Senior GCP Data Engineer - Python Programming

Posted on: 06/10/2025

Job Description

Description : This is a highly specific, senior-level role. The detailed job description for the Senior GCP Data Engineer position follows below.


Role : Senior GCP Data Engineer (Full-Time Employee)


Overview :


We are seeking an experienced and dedicated Senior GCP Data Engineer to play a lead role in designing, building, and optimizing our next-generation cloud data platform. The ideal candidate will have extensive hands-on experience with Google Cloud Platform (GCP) data services, possess deep expertise in big data processing using Spark, and be proficient in modern data warehousing and ETL/ELT practices. This is a full-time position located in Chennai.


Key Responsibilities :


- Data Architecture & Design : Lead the design and implementation of scalable, reliable, and high-performance data pipelines and data warehouse solutions on Google Cloud Platform (GCP).


- GCP Data Service Management : Architect and manage core GCP data services, including BigQuery for data warehousing, Dataproc for Spark workloads, and Cloud Storage for data lake solutions.


- ETL/ELT Development : Develop, optimize, and maintain complex data processing jobs using PySpark for large-scale data transformations.


- Workflow Orchestration : Design and manage automated data workflows using Cloud Composer (Apache Airflow) for scheduling, monitoring, and managing data pipelines.


- Coding & Scripting : Write high-quality, efficient, and well-documented code primarily in Python for data ingestion, transformation, and service automation.


- Data Modeling & Querying : Design optimized data models and write advanced, performant SQL queries within the BigQuery environment.


- Performance Tuning : Proactively identify and resolve data pipeline bottlenecks, optimizing resource usage and minimizing data latency and cost.


- Leadership & Mentorship : Provide technical guidance, perform code reviews, and mentor junior team members on best practices in GCP and data engineering.


Required Skills & Qualifications :


- Experience : 8 - 12 years of progressive experience in Data Engineering, with a significant focus on cloud-based data solutions.


- Cloud Expertise : Deep, hands-on experience and a strong understanding of the Google Cloud Platform (GCP) data ecosystem.


- GCP Data Services : Expert-level proficiency in the following :


- BigQuery : Advanced data warehousing, partitioning, clustering, cost optimization.


- Dataproc : Cluster configuration, optimizing Spark jobs, and managing big data processing.


- Cloud Storage : Data lake design and management, security, and lifecycle policies.


- Cloud Composer (Airflow) : Designing, deploying, and managing complex DAGs (Directed Acyclic Graphs).


- Programming : Strong command of Python for scripting, application development, and data manipulation.


- Big Data Processing : Mandatory expert-level experience with PySpark for scalable data transformation and processing.


- Data Querying : Advanced proficiency in SQL for complex queries, data analysis, and optimization.


- Architecture : Proven ability to translate business requirements into scalable, reliable technical data architectures.


Logistics & Hiring Details :


- Position : Senior GCP Data Engineer (Full-Time Employee)


- Location : Chennai


- Experience : 8 - 12 Years


- Compensation : Commensurate with experience and skill set.


- Notice Period : 0 - 20 days (candidates currently serving their notice period only). We are looking for immediate contributors.

