hirist

Senior Software Engineer - Data Quality

Kadel Labs
Bangalore
5 - 7 Years

Posted on: 23/01/2026

Job Description

Role Overview :

We are looking for a Senior Software Engineer - Data Quality Assurance (SSE-DQA) to uphold high standards of data reliability, accuracy, and governance across enterprise data platforms. The role involves designing and automating data quality checks and embedding validation processes into ETL pipelines within a Databricks-based environment.

Key Roles and Responsibilities :

- Define and implement data quality and validation rules based on business and technical requirements

- Perform data profiling, data cleansing, and data transformation to ensure high-quality datasets

- Monitor and measure data quality metrics including completeness, consistency, uniqueness, and accuracy

- Design and automate data validation and data quality workflows

- Integrate data validation processes into ETL pipelines and modern data platforms

- Implement and support data profiling, data quality, and data governance frameworks

- Develop and maintain Python-based scripts for data validation and automation

- Work hands-on with Databricks notebooks and Databricks-based data environments

- Collaborate with data engineers, analytics teams, and stakeholders to identify and resolve data issues
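To illustrate the kind of work described above, here is a minimal, hypothetical sketch of two of the listed data quality metrics (completeness and uniqueness) computed in plain Python; the column names and sample records are invented for demonstration and do not come from the role description:

```python
# Illustrative sketch: computing basic data quality metrics
# (completeness and uniqueness) over a small in-memory dataset.
# Column names and sample values below are hypothetical examples.

def completeness(rows, column):
    """Fraction of rows where `column` is present and non-null."""
    if not rows:
        return 0.0
    non_null = sum(1 for r in rows if r.get(column) is not None)
    return non_null / len(rows)

def uniqueness(rows, column):
    """Fraction of non-null values in `column` that are distinct."""
    values = [r.get(column) for r in rows if r.get(column) is not None]
    if not values:
        return 0.0
    return len(set(values)) / len(values)

records = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 3, "email": "a@example.com"},
    {"id": 3, "email": "c@example.com"},
]

print(completeness(records, "email"))  # 0.75 (3 of 4 rows non-null)
print(uniqueness(records, "id"))      # 0.75 (3 distinct of 4 values)
```

In production, checks like these would typically run inside Databricks notebooks over Spark DataFrames, but the metric definitions are the same.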

Required Skills & Qualifications :

- 5 - 7 years of experience in Data Quality Assurance / Data QA / Data Validation

- Strong expertise in defining and implementing data quality and validation rules

- Hands-on experience with data profiling, data cleansing, and data transformation

- Solid understanding of data quality metrics (completeness, consistency, uniqueness, accuracy)

- Experience automating data validation and quality-check workflows

- Experience integrating validation processes into ETL pipelines

- Strong Python scripting or development experience

- Strong SQL skills

- Hands-on experience with Databricks notebooks and Databricks environments
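One qualification above is integrating validation into ETL pipelines. A minimal sketch of that pattern, with hypothetical column names and a made-up threshold, might gate the load step on a batch-level quality check:

```python
# Hypothetical sketch: a validation gate embedded in an ETL step.
# The batch is loaded only if every required column meets a
# completeness threshold. Names and thresholds are illustrative.

def validate_batch(rows, required_columns, min_completeness=0.9):
    """Return a list of failure messages; an empty list means the batch passes."""
    failures = []
    for col in required_columns:
        present = sum(1 for r in rows if r.get(col) is not None)
        ratio = present / len(rows) if rows else 0.0
        if ratio < min_completeness:
            failures.append(f"{col}: completeness {ratio:.2f} < {min_completeness}")
    return failures

batch = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 2, "amount": None},
]
issues = validate_batch(batch, ["order_id", "amount"])
print(issues)  # ['amount: completeness 0.50 < 0.9']
```

A real pipeline would route failing batches to quarantine or alerting rather than printing, but the gating logic is the core of the pattern.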

Preferred Qualifications :

- Experience with Databricks Unity Catalog

- Exposure to Spark / PySpark

- Knowledge of data governance, metadata management, RBAC

- Cloud exposure : AWS / Azure / GCP
