Posted on: 29/08/2025
About the Role:
- Define and implement data quality rules and checks across platforms such as Databricks, BigQuery, Informatica, and DBT.
- Develop profiling, validation, and monitoring frameworks to continuously measure data quality KPIs (completeness, accuracy, timeliness, uniqueness, etc.).
- Collaborate with data stewards and business teams to formalize data ownership, glossaries, and stewardship workflows.
- Establish lineage and impact analysis across pipelines using cataloging tools (Informatica EDC, Unity Catalog, DataHub, etc.).
- Integrate data quality checks into CI/CD pipelines to ensure pre-deployment validations.
- Work closely with platform, MDM, and security teams to ensure compliance with enterprise data governance standards and regulatory policies.
- Support MDM-related governance efforts by coordinating master data definitions, hierarchies, and golden records.
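The data quality KPIs named in the responsibilities above (completeness, uniqueness, timeliness) can be sketched as simple row-level checks. This is a minimal illustration only; the record layout, field names (`customer_id`, `email`, `updated_at`), and sample data are assumptions, not part of the role description.

```python
from datetime import datetime, timedelta, timezone

# Illustrative records; the field names and values here are assumptions.
rows = [
    {"customer_id": 1, "email": "a@example.com", "updated_at": datetime.now(timezone.utc)},
    {"customer_id": 2, "email": None, "updated_at": datetime.now(timezone.utc)},
    {"customer_id": 2, "email": "b@example.com", "updated_at": datetime.now(timezone.utc) - timedelta(days=3)},
]

def completeness(rows, field):
    """Share of rows where `field` is populated."""
    return sum(r[field] is not None for r in rows) / len(rows)

def uniqueness(rows, field):
    """Share of distinct values among populated values of `field`."""
    values = [r[field] for r in rows if r[field] is not None]
    return len(set(values)) / len(values)

def timeliness(rows, field, max_age=timedelta(days=1)):
    """Share of rows updated within `max_age` of now."""
    now = datetime.now(timezone.utc)
    return sum(now - r[field] <= max_age for r in rows) / len(rows)

kpis = {
    "email_completeness": completeness(rows, "email"),        # 2 of 3 rows populated
    "customer_id_uniqueness": uniqueness(rows, "customer_id"),  # 2 distinct of 3 values
    "timeliness": timeliness(rows, "updated_at"),             # 2 of 3 rows fresh
}
```

In practice these checks would run against Databricks or BigQuery tables rather than in-memory rows, but the KPI definitions carry over directly.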
Must-Have Skills:
- Hands-on experience with:
- Informatica Data Quality (IDQ) or Axon/EDC.
- Unity Catalog (Databricks): configuring access control, lineage, and metadata.
- DBT: defining validation rules in the transformation layer.
- BigQuery and Databricks: integrating quality checks or observability.
- Strong understanding of data stewardship principles, metadata management, and data lifecycle policies.
- Experience building data quality dashboards, alerts, and monitoring rules.
- Familiarity with data catalogs, business glossaries, and data domain modeling.
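The alerting and CI/CD integration skills above typically come together as a pre-deployment quality gate. The sketch below is a hedged illustration of that pattern; the check names and threshold values are assumptions, not requirements from this posting.

```python
# Hypothetical KPI results produced by an upstream profiling step.
kpi_results = {"completeness": 0.97, "uniqueness": 1.0, "timeliness": 0.88}

# Per-KPI minimum thresholds; these values are illustrative only.
thresholds = {"completeness": 0.95, "uniqueness": 0.99, "timeliness": 0.90}

def gate(results, thresholds):
    """Return the list of failed checks; an empty list means the deploy may proceed."""
    return [name for name, minimum in thresholds.items()
            if results.get(name, 0.0) < minimum]

failures = gate(kpi_results, thresholds)
if failures:
    # In a CI pipeline, failing here (nonzero exit code) blocks the deployment stage.
    print("Data quality gate failed:", ", ".join(failures))
```

Wired into a CI/CD job, a nonzero exit on failure is what makes the validations "pre-deployment" rather than after-the-fact monitoring.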
Nice-to-Have Skills:
- Experience with DataHub, Collibra, or open metadata frameworks.
- Exposure to MDM tools and workflows (golden record creation, survivorship rules, etc.).
- Working knowledge of data privacy, access management, and sensitive data handling.
Preferred Certifications:
- Databricks Lakehouse Fundamentals / Unity Catalog Specialist.
- GCP Data Engineer or Professional Data Stewardship certification.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1537647