Posted on: 21/11/2025
Description :
Key Responsibilities :
- Design, develop, and maintain scalable data pipelines on GCP.
- Lead and execute data migration and modernization initiatives.
- Build, optimize, and manage ETL/ELT processes, ensuring data accuracy and quality.
- Collaborate closely with analytics, data science, and engineering teams to deliver data-driven solutions.
- Ensure compliance with data security, governance, and privacy standards.
- Monitor, troubleshoot, and enhance data workflows for performance and reliability.
Mandatory Skills :
- Expertise with GCP tools : BigQuery, Cloud Storage, Cloud Functions/Run, Dataflow (Apache Beam), Pub/Sub (a minimal pipeline sketch follows this list).
- Strong SQL and data modeling skills.
- Experience with ETL/ELT, CI/CD pipelines, and Git.
- Solid understanding of data warehousing and data governance.
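
For illustration only (not part of the role description) : a minimal streaming Dataflow pipeline over this stack might look like the sketch below. The project, topic, table, and schema names are placeholders, not details from this posting.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Placeholder project, bucket, topic, and table names throughout.
    options = PipelineOptions(
        streaming=True,
        project="my-gcp-project",
        runner="DataflowRunner",
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            # Read raw messages (bytes) from a Pub/Sub topic.
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/my-gcp-project/topics/events")
            # Decode and parse each message as a JSON row.
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Append parsed rows to a BigQuery table, creating it if absent.
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "my-gcp-project:analytics.events",
                schema="user_id:STRING,event:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()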
Preferred Skills :
- Familiarity with Docker, Kubernetes, and Terraform.
- Exposure to AWS or Azure.
- Knowledge of data quality/testing frameworks (a sample check follows this list).
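
As a rough sketch of what the last bullet implies, a data-quality check can be as simple as a pytest module over an extract. The load_events() helper and column names here are assumptions for illustration, not anything specified in this posting.

import pandas as pd


def load_events() -> pd.DataFrame:
    # Stand-in for a real extract; in practice this would query BigQuery.
    return pd.DataFrame(
        {
            "user_id": ["u1", "u2"],
            "event": ["click", "view"],
            "ts": pd.to_datetime(["2025-11-21", "2025-11-21"]),
        }
    )


def test_no_null_keys():
    # Primary key column must never contain nulls.
    df = load_events()
    assert df["user_id"].notna().all(), "user_id must never be null"


def test_no_duplicate_event_rows():
    # No two rows may share the same (user_id, ts) pair.
    df = load_events()
    assert not df.duplicated(subset=["user_id", "ts"]).any()

Run with pytest; richer frameworks (e.g. Great Expectations) package the same idea as declarative expectations.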
Functional Area : Data Engineering
Job Code : 1577988