Posted on: 05/03/2026
Description:
Experience: 5 to 10 Years
Location: Hyderabad
Work Mode: Hybrid (3 days work-from-office per week)
Key Responsibilities:
- Design, develop, and maintain data pipelines and data processing workflows on GCP.
- Implement scalable solutions using BigQuery and Dataflow for large-scale data processing.
- Build and manage AI/ML pipelines using Vertex AI and Cloud AI Pipelines.
- Develop orchestration workflows for data and ML pipeline automation.
- Ensure reliability, scalability, and performance of cloud-based data platforms.
- Collaborate with data engineers, ML engineers, and product teams to deliver analytics-ready datasets and AI solutions.
- Monitor and optimize pipeline performance and cloud resource utilization.
Required Skills:
- Strong experience with Google Cloud Platform (GCP).
- Hands-on experience with BigQuery and Dataflow.
- Experience with Vertex AI and Cloud AI Pipelines.
- Experience building data pipelines and data processing workflows.
- Knowledge of orchestration and automation frameworks.
- Strong understanding of cloud data architecture and distributed data processing.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1617837