Posted on: 31/01/2026
Role Overview:
We are seeking an experienced GCP BigQuery Engineer with strong expertise in BigQuery, DBT, and Python to design, build, and support scalable data platforms on Google Cloud. The role involves end-to-end ownership of data pipelines, production support, and close collaboration with business and engineering stakeholders to deliver reliable, high-performance data solutions.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines using BigQuery, DBT, and Python
- Build, optimize, and manage ETL/ELT workflows on Google Cloud Platform (GCP)
- Leverage GCP services such as Cloud Composer, Dataflow, and Cloud Storage
- Provide production support for batch and streaming data workflows, ensuring:
a. High availability
b. Performance optimization
c. Data accuracy and integrity
- Monitor pipelines and proactively troubleshoot production issues
- Collaborate with cross-functional teams to gather requirements and deliver robust, business-ready data solutions
- Automate operational processes and implement alerting & monitoring using Python and GCP monitoring tools
- Develop and enhance DBT models, ensuring:
a. Modular design
b. Strong test coverage
c. Clear documentation
- Ensure adherence to data governance, security, and privacy standards
Required Skills & Qualifications:
- 6+ years of experience in Data Engineering or related roles
- Strong hands-on experience with:
a. Google BigQuery
b. DBT
c. Python
- Solid understanding of ETL/ELT architecture and best practices
- Experience working with GCP services (Cloud Composer, Dataflow, Cloud Storage, etc.)
- Proven experience in production support for data pipelines
- Knowledge of batch and streaming data processing
- Strong problem-solving and debugging skills
- Strong communication skills and the ability to work with cross-functional teams
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1608304