Posted on: 24/09/2025
About the Role:
We are seeking a highly skilled Data Engineer to design, develop, and optimize scalable data pipelines and solutions on Google Cloud Platform (GCP). The ideal candidate will have strong expertise in BigQuery, SQL, Python/Java, and hands-on experience in building robust data pipelines to support advanced analytics and business intelligence.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL/ELT processes.
- Work extensively on GCP services including BigQuery, Cloud Storage, Dataflow, Pub/Sub, and Composer.
- Optimize data models and queries for performance and cost efficiency in BigQuery.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
- Ensure data quality, governance, and security across all data pipelines and storage systems.
- Troubleshoot and resolve data pipeline issues in real time.
- Contribute to architecture design discussions and provide best practices for data engineering.
Required Skills:
- Strong hands-on experience with Google Cloud Platform (GCP), particularly BigQuery.
- Proven expertise in building and maintaining data pipelines.
- Strong SQL skills for query optimization and large-scale data manipulation.
- Proficiency in Python or Java for developing scalable ETL/ELT solutions.
- Good understanding of data modeling, partitioning, and performance tuning.
- Experience with workflow orchestration tools (e.g., Airflow/Cloud Composer) is a plus.
- Familiarity with CI/CD, version control (Git), and agile methodologies.
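As an illustrative sketch only (not part of the role description), the kind of ETL/ELT transform logic referenced above might look like the following minimal Python example. All names here are hypothetical; the pattern shown is cleaning raw records and deriving a date column of the sort used for date-partitioned warehouse tables such as partitioned BigQuery tables:

```python
from datetime import datetime

def transform_events(raw_rows):
    """Clean raw event rows: drop records missing an id,
    parse ISO-8601 timestamps, and derive a partition_date
    column (a common pattern for date-partitioned tables)."""
    cleaned = []
    for row in raw_rows:
        if not row.get("event_id"):
            continue  # skip malformed records lacking an id
        ts = datetime.fromisoformat(row["event_ts"])
        cleaned.append({
            "event_id": row["event_id"],
            "event_ts": ts,
            "partition_date": ts.date(),
        })
    return cleaned

# Hypothetical input: one valid record, one missing its id.
rows = [
    {"event_id": "a1", "event_ts": "2025-09-24T10:00:00"},
    {"event_id": None, "event_ts": "2025-09-24T11:00:00"},
]
print(len(transform_events(rows)))  # 1 valid row survives
```

In a production pipeline this step would typically run inside an orchestrated task (e.g. an Airflow/Cloud Composer DAG) before loading into BigQuery.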
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1551188