hirist

Job Description

Description:


Role: Test Engineer - Data


Mandatory Skills: Python proficiency, SQL queries, dbt testing


Purpose:


Own the end-to-end data validation strategy for the migration, ensuring that every model in the new Snowflake/dbt stack is provably equivalent to the legacy SQL Server/SSIS/Excel baseline before any cutover decision is made. This role acts as the quality gate between legacy decommission and production go-live.


Role Summary:


The Data Quality / Test Engineer is responsible for ensuring the migrated platform produces trusted and accurate data outputs. This role defines and executes the testing strategy across ingestion, transformation, semantic layers, and reporting outputs. It plays a critical role in source-to-target reconciliation, defect identification, UAT support, and parallel-run validation, especially where finance and leadership reporting require a high level of trust.


KEY RESPONSIBILITIES:


- Design and maintain the migration validation framework: row-count reconciliation, aggregate hash comparison, and business-rule assertion libraries.

- Build automated reconciliation scripts (Python + SQL) comparing SQL Server source outputs against their Snowflake Silver- and Gold-layer equivalents.

- Define the end-to-end testing strategy for data ingestion, transformation, metrics, and reporting.

- Create and execute test cases for source-to-target reconciliation, business rules, and output validation.

- Validate completeness, accuracy, freshness, and consistency of data across the target platform.

- Support system integration testing, user acceptance testing, regression testing, and cutover readiness.

- Identify, document, and track defects through resolution.

- Help define ongoing production data quality controls after go-live.

- Implement and maintain dbt test suites using schema tests and custom SQL assertions for column-level diffs.

- Define data quality SLAs by domain and track SLA adherence.

- Manage the cutover readiness checklist, a signed-off artifact required before any production cutover window is opened.

- Set up ongoing post-go-live data quality alerting driven by dbt test failures.

- Build and maintain a reconciliation dashboard in Sigma so business stakeholders can validate outputs visually.

- Conduct root-cause analysis on all data quality failures found during UAT and track resolution to closure.

- Document test coverage gaps and escalate untested business rules to the dbt Engineering Lead for model updates.
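The row-count and aggregate-hash checks above can be sketched in a few lines of Python. This is a minimal illustration, not the project's actual framework: the sample rows are hypothetical, and a real job would fetch them from SQL Server and Snowflake cursors rather than in-memory lists.

```python
import hashlib

def row_fingerprint(row):
    # Canonicalize a row to a stable string: pipe-delimited, NULL for None.
    return "|".join("NULL" if v is None else str(v) for v in row)

def aggregate_hash(rows):
    # Order-insensitive aggregate: sort canonical rows, hash the whole set.
    canon = sorted(row_fingerprint(r) for r in rows)
    return hashlib.sha256("\n".join(canon).encode("utf-8")).hexdigest()

def reconcile(source_rows, target_rows):
    # Two cheap gates: row counts must match, and the aggregate hashes
    # must match. A hash mismatch with equal counts signals value drift.
    return {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "count_match": len(source_rows) == len(target_rows),
        "hash_match": aggregate_hash(source_rows) == aggregate_hash(target_rows),
    }

# Hypothetical example: target row 2 drifted in the amount column.
src = [(1, "ACME", "100.00"), (2, "GLOBEX", "250.50")]
tgt = [(1, "ACME", "100.00"), (2, "GLOBEX", "250.55")]
print(reconcile(src, tgt))
```

Counts match here but the hashes differ, which is exactly the case a column-level diff would then be run to localize.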


REQUIRED SKILLS & EXPERIENCE:


- Experience with data migration testing, reconciliation, and data quality validation.

- Strong SQL and analytical validation skills.

- Experience testing modern cloud data platforms and reporting solutions.

- 4+ years in data engineering or analytics engineering with a focus on data quality and testing.

- Python proficiency for scripting bulk reconciliation jobs and generating validation reports.

- Attention to precision in financial data: an understanding of decimal type handling, rounding, and floating-point risks.

- Ability to work with both technical teams and business users.
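The decimal and floating-point risks called out above are easy to demonstrate with Python's standard library: binary floats carry representation error that a strict reconciliation will flag as a false mismatch, and the default rounding mode (half-even) differs from the half-up rounding many finance systems expect.

```python
from decimal import Decimal, ROUND_HALF_UP

# Binary floats accumulate representation error.
float_total = 0.1 + 0.2                         # 0.30000000000000004, not 0.3
exact_total = Decimal("0.1") + Decimal("0.2")   # Decimal('0.3'), exact

# Rounding mode matters too: round() uses banker's rounding (half-even),
# while finance systems often require half-up.
half_even = round(2.5)                                                  # 2
half_up = Decimal("2.5").quantize(Decimal("1"), rounding=ROUND_HALF_UP)  # 3

print(float_total, exact_total, half_even, half_up)
```

This is why reconciliation scripts for financial data typically compare values as `Decimal` (or within an agreed tolerance) rather than as raw floats.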


NICE TO HAVE:


- Experience with programmatic data quality frameworks.

- Familiarity with Snowflake Time Travel for point-in-time validation during parallel-run periods.

- Background in financial services or regulated-industry data validation.

