Posted on: 25/11/2025
Key Responsibilities:
- Build and maintain end-to-end GCP data pipelines (batch and streaming).
- Ensure data platform uptime and performance in alignment with defined SLAs/SLOs.
- Develop ETL/ELT workflows using Cloud Composer (Airflow), Dataflow (Apache Beam), and BigQuery.
- Manage and enhance data lakes and warehouses using BigQuery and Cloud Storage.
- Implement streaming data solutions using Pub/Sub, Dataflow, or Kafka.
- Build data APIs and microservices for data consumption using Cloud Run, Cloud Functions, or App Engine.
- Define and enforce data quality, governance, and lineage using Data Catalog and Cloud Data Quality tools.
- Collaborate with DevOps to build CI/CD pipelines, infrastructure as code, and automated monitoring for data workflows.
- Participate in incident management, root cause analysis (RCA), and change control processes following ITIL best practices.
- Mentor junior engineers and ensure adherence to engineering best practices.
Key Competencies:
- Experience in managed service delivery under strict SLAs (availability, latency, and resolution timelines).
- Strong understanding of GCP, IAM, and cost management.
- Strong knowledge of incident management, change management, and problem management using ITSM tools (e.g., ServiceNow, Jira).
- Ability to lead on-call operations in a 16/5 support model, coordinating with global teams across time zones.
- Understanding of data security, governance, and compliance.
- Excellent communication and client-handling skills.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1579771