Posted on: 28/11/2025
Description:
- We are seeking a highly skilled Data Engineer to design, develop and optimize scalable data pipelines and solutions on Google Cloud Platform (GCP).
- The ideal candidate will have strong expertise in BigQuery, SQL, Python/Java and hands-on experience in building robust data pipelines to support advanced analytics and business intelligence.
- Design, develop and maintain scalable data pipelines and ETL/ELT processes.
- Work extensively on GCP services including BigQuery, Cloud Storage, Dataflow, Pub/Sub and Composer.
- Optimize data models and queries for performance and cost efficiency in BigQuery.
- Collaborate with data analysts, data scientists and business stakeholders to understand data requirements.
- Ensure data quality, governance and security across all data pipelines and storage systems.
- Troubleshoot and resolve data pipeline issues in real time.
- Contribute to architecture design discussions and provide best practices for data engineering.
Requirements:
- Strong hands-on experience with Google Cloud Platform (GCP), particularly BigQuery.
- Proven expertise in building and maintaining data pipelines.
- Strong SQL skills for query optimization and large-scale data manipulation.
- Proficiency in Python or Java for developing scalable ETL/ELT solutions.
- Good understanding of data modeling, partitioning and performance tuning.
- Experience with workflow orchestration tools (e.g., Airflow/Cloud Composer) is a plus.
- Familiarity with CI/CD, version control (Git) and agile methodologies.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1582350