Posted on: 20/01/2026
Description:
Key Responsibilities:
- Design, develop, and maintain batch and real-time data pipelines using GCP services.
- Develop and optimize data models and queries using BigQuery.
- Build streaming and batch pipelines using Dataflow, Pub/Sub, and Apache Beam (see the illustrative pipeline sketch after this list).
- Orchestrate and schedule workflows using Cloud Composer (Airflow).
- Implement data governance, security, and observability using Dataplex and related tools.
- Automate deployments using CI/CD pipelines and Infrastructure as Code with GitHub Actions.
- Integrate data platforms with AI/ML workflows using Vertex AI.
- Tune performance and optimize costs of GCP resources.
- Collaborate with cross-functional teams to deliver reliable and scalable data solutions.
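
For context on the stack named above, here is a minimal sketch of the kind of streaming pipeline described, written with Apache Beam's Python SDK and runnable on Dataflow. The project, subscription, bucket, and table names (example-project, example-sub, gs://example-bucket/tmp, analytics.events) are placeholders for illustration only, not details from this posting.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Placeholder pipeline options; switch runner to "DirectRunner" to test locally.
    options = PipelineOptions(
        streaming=True,
        project="example-project",
        region="us-central1",
        runner="DataflowRunner",
        temp_location="gs://example-bucket/tmp",
    )

    with beam.Pipeline(options=options) as p:
        (
            p
            # Read raw messages from a Pub/Sub subscription (bytes).
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/example-project/subscriptions/example-sub")
            # Decode and parse each message as JSON.
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Append parsed records to an existing BigQuery table.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```

Validating such a pipeline with the DirectRunner before submitting it to Dataflow is a common workflow for this kind of role.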
Required Skills:
- Strong hands-on experience with Google Cloud Platform services such as BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Dataplex (a minimal Composer DAG sketch follows this list).
- Strong SQL skills for data analysis, transformation, and optimization.
- Proficiency in Python for data engineering and pipeline development.
- Experience with Apache Beam for building batch and streaming data pipelines.
- Hands-on experience with CI/CD pipelines and Infrastructure as Code using GitHub Actions.
- Familiarity with data governance, security, and observability tools and best practices.
- Experience integrating data pipelines with AI/ML platforms using Vertex AI.
- Strong understanding of performance tuning and cost optimization in GCP environments.
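
Similarly, a minimal sketch of a Cloud Composer (Airflow) DAG that schedules a daily BigQuery job is shown below; the DAG id, dataset, table, and SQL are hypothetical placeholders rather than anything specified by this role.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_events_rollup",   # hypothetical DAG name
    schedule_interval="@daily",     # run once per day
    start_date=datetime(2026, 1, 1),
    catchup=False,
) as dag:
    # Aggregate the previous day's events into a reporting table (illustrative SQL).
    rollup_events = BigQueryInsertJobOperator(
        task_id="rollup_events",
        configuration={
            "query": {
                "query": (
                    "SELECT event_date, COUNT(*) AS event_count "
                    "FROM `example-project.analytics.events` "
                    "GROUP BY event_date"
                ),
                "useLegacySql": False,
            }
        },
        location="US",  # BigQuery location of the placeholder dataset
    )
```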
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1603687