Posted on: 18/02/2026
Description:
- Utilize Dataplex capabilities for metadata harvesting, data profiling, and data quality assessments
- Integrate Dataplex with enterprise platforms such as MDM, Tableau, and Power BI
- Develop and maintain robust CI/CD pipelines and implement Infrastructure as Code using tools such as Terraform
- Build and optimize data pipelines using GCP services including BigQuery, Google Cloud Storage, Analytics Hub, and IAM
- Develop automation scripts and tools using Python (an illustrative sketch follows this list)
- Implement data governance, data cataloging, and metadata management practices
- Write and execute unit tests and integration tests for data pipelines and automation scripts
- Collaborate with cross-functional and global teams to deliver high-quality data solutions
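As a rough illustration of the Python automation and data quality work described above, the sketch below runs a simple null-rate check against a BigQuery table using the google-cloud-bigquery client. The project, dataset, table, column, and the 1% threshold are assumptions for the example only, not details taken from this role.

    # Minimal sketch of a data quality check in BigQuery (illustrative only).
    from google.cloud import bigquery

    def null_rate(client: bigquery.Client, table: str, column: str):
        """Return the fraction of NULL values in `column` of `table`."""
        sql = f"""
            SELECT SAFE_DIVIDE(COUNTIF({column} IS NULL), COUNT(*)) AS null_rate
            FROM `{table}`
        """
        row = next(iter(client.query(sql).result()))
        return row.null_rate  # None if the table is empty

    if __name__ == "__main__":
        client = bigquery.Client()  # uses Application Default Credentials
        # Hypothetical table and column names, used only for the example.
        rate = null_rate(client, "my-project.analytics.orders", "customer_id")
        if rate is None or rate > 0.01:  # illustrative 1% threshold
            raise SystemExit(f"Quality check failed: null rate = {rate}")
        print(f"customer_id null rate within threshold: {rate:.2%}")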
Required Skills and Experience:
- Strong hands-on experience with Google Cloud Dataplex and GCP ecosystem
- Proficiency in BigQuery, Google Cloud Storage, Analytics Hub, and IAM
- Experience in automation and scripting using Python
- Experience in DevOps/DataOps practices including CI/CD and Terraform (a minimal unit-test sketch follows this list)
- Familiarity with BI tools such as Tableau and Power BI
- Experience with enterprise data platforms including BigQuery, Oracle, SQL Server, and Spanner
- Understanding of data governance, metadata management, and data cataloging concepts
- Strong analytical, problem-solving, communication, and collaboration skills
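To make the testing expectation above concrete, here is a minimal pytest-style sketch for a small pipeline transform. The normalise_record helper and its behaviour are hypothetical, included only to show the shape of such a unit test.

    # Illustrative unit tests for a hypothetical pipeline transform (pytest).
    def normalise_record(record: dict) -> dict:
        """Lower-case keys and strip whitespace from string values."""
        return {
            key.lower(): value.strip() if isinstance(value, str) else value
            for key, value in record.items()
        }

    def test_normalise_record_strips_and_lowercases():
        raw = {"Customer_ID": " 42 ", "Region": "EMEA"}
        assert normalise_record(raw) == {"customer_id": "42", "region": "EMEA"}

    def test_normalise_record_preserves_non_strings():
        assert normalise_record({"amount": 10}) == {"amount": 10}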
Preferred Qualifications:
- Experience implementing enterprise-level data governance frameworks
- Exposure to large-scale cloud data modernization programs
- Experience working in global or distributed delivery environments
What We Offer:
- Exposure to large-scale cloud, data, and AI transformation projects
- Dynamic, collaborative, and performance-driven work environment
- Continuous learning and professional growth opportunities within a high-growth organization
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1613611