Posted on: 16/12/2025
Description :
Job title : Data Engineer
Location : Bangalore
Experience : 6 to 12 Years
Role Overview :
We are looking for an experienced Data Engineer to design, build, and maintain scalable, reliable data pipelines and platforms on Google Cloud Platform (GCP).
The ideal candidate will have strong hands-on experience with modern data engineering tools and will work closely with analytics, data science, and business teams.
Key Responsibilities :
Data Pipeline & Platform Development :
- Design, develop, and maintain end-to-end data pipelines for batch and streaming data.
- Build scalable data processing solutions using BigQuery, Dataflow, Dataproc, or Cloud Data Fusion (see the pipeline sketch after this list).
- Ensure data quality, reliability, and performance across data workflows.
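As an illustration of the batch-pipeline work described above, here is a minimal Apache Beam sketch (the programming model behind Dataflow). The bucket paths, event fields, and aggregation are hypothetical, not part of this posting; the same pipeline can target the Dataflow runner via pipeline options.

```python
# Minimal Apache Beam sketch (the SDK behind Dataflow); runs locally
# with the default runner. Paths and field names are illustrative.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

with beam.Pipeline(options=PipelineOptions()) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/raw/events/*.json")
        | "Parse" >> beam.Map(json.loads)
        | "KeepComplete" >> beam.Filter(lambda e: e.get("status") == "COMPLETE")
        | "KeyByRegion" >> beam.Map(lambda e: (e["region"], 1))
        | "CountPerRegion" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda region, n: f"{region},{n}")
        | "Write" >> beam.io.WriteToText("gs://my-bucket/curated/region_counts")
    )
```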
Workflow Orchestration :
- Develop and manage workflows using Cloud Composer (managed Apache Airflow) or self-managed Airflow.
- Schedule, monitor, and troubleshoot data pipelines.
- Implement retry, alerting, and failure-handling mechanisms (see the sketch below).
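As an illustration of this orchestration work, here is a minimal Airflow DAG sketch with retries and a failure callback, assuming Airflow 2.4+; the DAG id, schedule, and task logic are hypothetical placeholders.

```python
# Minimal Airflow DAG sketch with retries and a failure callback,
# assuming Airflow 2.4+; dag_id, schedule, and task body are
# hypothetical placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_load(**context):
    # Placeholder for the real extract/load logic (e.g., GCS -> BigQuery).
    print(f"Running load for {context['ds']}")


def notify_failure(context):
    # Wire a real alerting channel (email, Slack, PagerDuty) in here.
    print(f"Task {context['task_instance'].task_id} failed")


default_args = {
    "retries": 3,                           # retry each task up to 3 times
    "retry_delay": timedelta(minutes=5),    # back off between attempts
    "on_failure_callback": notify_failure,  # alert once retries are exhausted
}

with DAG(
    dag_id="daily_sales_load",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
):
    PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
```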
Data Transformation & Modeling :
- Implement data transformations using dbt, following analytics engineering best practices.
- Design and maintain data models optimized for analytics and reporting.
- Write optimized SQL queries for data analysis and reporting (see the sketch below).
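For the SQL side, here is a minimal sketch of running a partition-pruned analytics query against BigQuery from Python; the project, table, and columns are hypothetical.

```python
# Minimal sketch of an analytics query against BigQuery from Python;
# project, dataset, and columns are hypothetical. Filtering on the
# partition column (order_date) prunes partitions and limits bytes
# scanned, which is the main BigQuery cost lever.
from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

query = """
    SELECT customer_id, SUM(order_total) AS revenue
    FROM `my-project.sales.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY customer_id
    ORDER BY revenue DESC
    LIMIT 100
"""

for row in client.query(query).result():
    print(row.customer_id, row.revenue)
```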
Programming & Processing :
- Develop data processing logic using Python or PySpark.
- Optimize distributed processing jobs for performance and cost efficiency (see the sketch after this list).
- Handle large-scale datasets efficiently.
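A minimal PySpark sketch of the kind of large-scale batch processing involved; bucket paths and columns are hypothetical. Filtering early and writing date-partitioned output are typical performance and cost levers.

```python
# Minimal PySpark sketch for large-scale batch aggregation; bucket
# paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-aggregation").getOrCreate()

orders = spark.read.parquet("gs://my-bucket/raw/orders/")

daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETE")  # drop irrelevant rows early
    .groupBy("order_date", "region")
    .agg(F.sum("order_total").alias("revenue"))
)

# Partition output by date so downstream jobs read only what they need.
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
    "gs://my-bucket/curated/daily_revenue/"
)

spark.stop()
```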
Cloud & Containerization :
- Build and manage data workloads on Google Kubernetes Engine (GKE).
- Deploy, scale, and monitor containerized data services (see the sketch below).
- Work with cloud-native GCP services for data ingestion and storage.
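As one example of operating containerized services on GKE, here is a minimal sketch using the official Kubernetes Python client; the deployment name and namespace are hypothetical, and cluster credentials are assumed to be configured already (e.g., via gcloud container clusters get-credentials).

```python
# Minimal sketch of scaling a containerized data service on GKE with
# the official Kubernetes Python client; deployment name and namespace
# are hypothetical. Assumes kubeconfig credentials are already set up.
from kubernetes import client, config

config.load_kube_config()  # read the local kubeconfig
apps = client.AppsV1Api()

# Scale the hypothetical "ingest-worker" deployment to 5 replicas.
apps.patch_namespaced_deployment_scale(
    name="ingest-worker",
    namespace="data-platform",
    body={"spec": {"replicas": 5}},
)
```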
Analytics & BI Integration :
- Support analytics and reporting use cases using Looker.
- Collaborate with BI teams to ensure data availability and semantic modeling.
- Enable self-service analytics through clean and reliable datasets.
Monitoring, Debugging & Optimization :
- Monitor data pipelines and infrastructure health.
- Troubleshoot production issues and optimize performance.
- Ensure cost optimization across GCP data services.
Required Skills & Qualifications :
- 6 to 12 years of experience in Data Engineering
- Strong hands-on experience with BigQuery
- Experience with Cloud Composer / Apache Airflow
- Hands-on experience with Dataflow, Dataproc, or Cloud Data Fusion
- Proficiency in Python or PySpark
- Strong experience with dbt
- Advanced SQL skills
- Experience with GKE
- Hands-on experience with Looker
Nice to Have :
- Experience with streaming pipelines (Pub/Sub, Kafka)
- Knowledge of data governance and security
- GCP certifications
- Experience in agile data teams
Posted in : Data Engineering
Functional Area : Data Engineering
Job Code : 1591390