Posted on: 14/01/2026
About Kadel Labs :
Kadel Labs is a leading IT services company that has delivered technology solutions since 2017, enhancing business operations and productivity through tailored, scalable, and future-ready services. With deep domain expertise and a commitment to innovation, we help businesses stay ahead of technological trends. As a CMMI Level 3 and ISO 27001:2022 certified company, we ensure best-in-class process maturity and information security, enabling organizations to achieve their digital transformation and automation goals with confidence.
Position : Data Quality Analyst (DQA)
Location : Udaipur / Jaipur
Experience : 3-6 years
We are seeking a Data Quality Analyst (DQA) to support a critical data migration program that moves data from Azure Data Lake Storage (ADLS) to MongoDB using .NET-based services. The role focuses on validating data integrity, accuracy, and completeness across source, intermediate, and target systems using SQL, Python, and Databricks.
Key Responsibilities :
- Validate data migration from ADLS to MongoDB across multiple stages
- Perform source-to-target data reconciliation (see the sketch after this list), including:
  - Record counts
  - Schema and data type validation
  - Business rule validation
  - Duplicate and null checks
- Write SQL queries to validate data in source and intermediate layers
- Develop Python scripts for automated data validation and comparisons
- Use Databricks notebooks for data profiling, validation, and analysis
- Validate outputs generated by .NET-based migration pipelines
- Identify, log, and track data quality defects with clear root cause analysis
- Collaborate with Data Engineering, .NET, and QA teams to resolve issues
- Support UAT, regression testing, and migration sign-off
- Prepare data quality reports, test cases, and validation documents
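For illustration, the reconciliation checks described above could be scripted in a Databricks (PySpark) notebook along the lines of the sketch below. This is a minimal sketch only: the ADLS path, database, collection, and column names are placeholders, and it assumes the cluster exposes the built-in spark session and has the MongoDB Spark connector configured.

# Minimal reconciliation sketch (illustrative only). Assumptions: Databricks
# notebook with the built-in spark session and the MongoDB Spark connector
# configured; all paths, database/collection names, and columns are placeholders.
from pyspark.sql import functions as F

# Source: files landed in ADLS Gen2 (hypothetical storage account and path)
source_df = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/customers/")

# Target: MongoDB collection written by the .NET migration pipeline (hypothetical names)
target_df = (spark.read.format("mongodb")
             .option("database", "migration_db")
             .option("collection", "customers")
             .load())

# 1. Record counts
print("source rows:", source_df.count(), "| target rows:", target_df.count())

# 2. Null check on the business key (column name is illustrative)
print("null customer_id in target:",
      target_df.filter(F.col("customer_id").isNull()).count())

# 3. Duplicate check on the business key
dupes = target_df.groupBy("customer_id").count().filter(F.col("count") > 1)
print("duplicate keys in target:", dupes.count())

# 4. Keys present in source but missing from target
missing = source_df.select("customer_id").subtract(target_df.select("customer_id"))
print("keys missing from target:", missing.count())

In practice, checks of this kind would typically be parameterised per entity and their results fed into the data quality reports and validation documents mentioned above.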
Required Technical Skills :
Data & Validation Skills :
- Strong SQL skills (joins, aggregations, filters, subqueries)
- Hands-on experience with Python for data validation and scripting
- Solid understanding of data quality dimensions (accuracy, completeness, consistency)
Platform & Tools :
- Working knowledge of Databricks (notebooks, Spark basics, Spark SQL)
- Experience with Azure Data Lake Storage (ADLS Gen2)
- Basic understanding of MongoDB (collections, documents, querying; a brief sketch follows this list)
- Conceptual understanding of .NET-based data migration services
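To make the MongoDB expectation above concrete, here is a minimal sketch of basic collection-level checks using pymongo; the connection string, database, collection, and field names are assumptions rather than project specifics.

# Minimal MongoDB validation sketch (illustrative only). Assumptions: pymongo is
# installed; the URI, database, collection, and field names are placeholders.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")   # placeholder connection string
coll = client["migration_db"]["customers"]          # hypothetical database/collection

# Total documents in the target collection
print("target documents:", coll.count_documents({}))

# Documents where the business key is null or missing (field name is illustrative)
print("null/missing customer_id:", coll.count_documents({"customer_id": None}))

# Duplicate business keys via an aggregation pipeline
pipeline = [
    {"$group": {"_id": "$customer_id", "n": {"$sum": 1}}},
    {"$match": {"n": {"$gt": 1}}},
]
print("duplicate keys:", len(list(coll.aggregate(pipeline))))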
Testing Skills :
- Strong understanding of testing fundamentals
- Experience in :
  - Data migration testing
  - Data validation and reconciliation testing
  - Integration and regression testing
  - Defect tracking using Azure DevOps or similar tools
- Ability to create data-driven test cases and test scenarios
Good to Have :
- Prior experience in data migration projects (cloud-to-cloud or on-prem to cloud)
- Exposure to NoSQL database validation (MongoDB preferred)
- Knowledge of Spark SQL and performance validation
- Experience working in Agile / Scrum environments
- Familiarity with data profiling or data quality frameworks
Soft Skills :
- Strong analytical and problem-solving abilities
- Clear communication and documentation skills
- Ability to work independently in fast-paced migration programs
- Strong collaboration skills across cross-functional teams
Key Value Add :
This role offers hands-on exposure to Azure, Databricks, MongoDB, and modern data migration architectures, with the opportunity to work on large-scale enterprise data transformation initiatives.
Posted in : Data Engineering
Functional Area : QA & Testing
Job Code : 1601265