Posted on: 25/11/2025
We are looking for an experienced GCP Data Engineer with strong hands-on expertise in building scalable data pipelines, data models, and cloud-based data solutions. The ideal candidate should be highly proficient in Python, SQL, BigQuery, and Airflow, with a deep understanding of Google Cloud Platform (GCP) services. This role involves designing and implementing data workflows, optimizing infrastructure, and collaborating with cross-functional teams to enable reliable and efficient data processing across the organization.
Key Responsibilities:
- Design, build, and maintain scalable and reliable data pipelines using Python, SQL, BigQuery, Airflow, and GCP-native services (see the DAG sketch after this list).
- Develop ETL/ELT processes to ingest, transform, and load structured and unstructured data.
- Optimize and automate workflows using Cloud Composer (Airflow) and Cloud Functions.
- Build data models, data marts, and warehouse solutions using BigQuery.
- Implement best practices for schema design, partitioning, clustering, and performance tuning (see the partitioned-table sketch after this list).
- Ensure data architecture aligns with business requirements and analytical needs.
- Build and manage solutions using GCP components such as Cloud Storage, Pub/Sub, Dataflow, Cloud Functions, and Cloud Composer.
- Deploy, orchestrate, and monitor data workflows within the GCP ecosystem.
- Implement secure, scalable, and cost-efficient cloud architectures.
- Troubleshoot and debug data processing issues, ensuring high reliability and data accuracy.
- Monitor pipeline performance and optimize resource utilization.
- Perform root-cause analysis for pipeline failures and implement long-term fixes.
- Work closely with data analysts, data scientists, product teams, and business stakeholders to understand data needs.
- Translate requirements into technical specifications and deliver actionable data solutions.
- Communicate complex technical concepts clearly to non-technical teams.
- Maintain clear documentation of workflows, architecture, and system components.
- Contribute to improving engineering standards, processes, and best practices.
- Support CI/CD implementation for data pipelines and deployments.
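
To make the pipeline work concrete, here is a minimal sketch of the kind of Airflow DAG this role involves: a daily load from Cloud Storage into BigQuery followed by an in-warehouse SQL transform. The bucket, dataset, and table names are hypothetical placeholders, and the operators assume the apache-airflow-providers-google package is installed.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_events_elt",      # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Ingest the day's newline-delimited JSON files from a landing bucket
    # into a raw BigQuery table (ELT: load first, transform in-warehouse).
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_events",
        bucket="example-landing-bucket",                            # hypothetical bucket
        source_objects=["events/{{ ds }}/*.json"],
        destination_project_dataset_table="analytics.raw_events",  # hypothetical table
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform the raw rows into a curated table with standard SQL.
    transform = BigQueryInsertJobOperator(
        task_id="transform_events",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE analytics.events AS "
                    "SELECT * FROM analytics.raw_events"            # placeholder transform
                ),
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```

A production version would add retries, alerting, and data-quality checks, but this shows the ingest-then-transform shape the bullets above describe.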
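For the schema-design side, here is a minimal sketch of creating a date-partitioned, clustered BigQuery table with the google-cloud-bigquery Python client; the project, dataset, table, and field names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

# A simple event schema (hypothetical fields).
schema = [
    bigquery.SchemaField("event_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
    bigquery.SchemaField("customer_id", "STRING"),
]

table = bigquery.Table("example-project.analytics.events", schema=schema)

# Partition by day on the event timestamp so queries can prune by date,
# and cluster on customer_id so per-customer scans touch fewer blocks.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",
)
table.clustering_fields = ["customer_id"]

client.create_table(table)
```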
Required Skills & Expertise:
- Strong experience with Python, SQL, and BigQuery.
- Hands-on experience with Airflow (or Cloud Composer).
- Solid expertise in GCP services: BigQuery, Cloud Storage, Pub/Sub, Cloud Functions, and Cloud Composer.
- Strong understanding of data modeling, data warehousing, and ETL/ELT pipelines.
- Excellent problem-solving and data debugging skills.
- Experience with Dataflow (Python or Java).
- Knowledge of Docker, Shell scripting, and CI/CD pipelines.
- Familiarity with Looker Studio (Data Studio) or similar visualization tools.
- Experience with event-driven architectures, streaming pipelines, or microservices (see the Pub/Sub sketch after this list).
- Strong communication skills for effective cross-functional collaboration.
- Ability to manage multiple tasks and deliver high-quality solutions under deadlines.
- Detail-oriented, analytical, and proactive.
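
For the event-driven and streaming items above, here is a minimal sketch of a Pub/Sub pull subscriber using the google-cloud-pubsub Python client; the project and subscription names are hypothetical.

```python
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
# Hypothetical project and subscription names.
subscription_path = subscriber.subscription_path("example-project", "events-sub")

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # A real pipeline would parse, validate, and route the event;
    # here we just print the payload and acknowledge it.
    print(message.data.decode("utf-8"))
    message.ack()

streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
print(f"Listening on {subscription_path}...")

try:
    streaming_pull.result(timeout=30)  # block for 30s, then shut down
except TimeoutError:
    streaming_pull.cancel()            # stop pulling new messages
    streaming_pull.result()            # wait for shutdown to complete
```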
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1579814