
Job Description

Job Purpose :


We are seeking a highly skilled and detail-oriented ETL Testing Specialist to join our data quality assurance team.
In this role, you will be responsible for ensuring the accuracy, integrity, and performance of data as it moves through various Extract, Transform, Load (ETL) processes.
You will play a critical role in validating data transformations, verifying data loads into data warehouses/lakes, and ensuring that our data solutions meet stringent business requirements and quality standards.



Key Responsibilities :



Test Strategy & Planning :


- Understand ETL design and architecture to develop comprehensive test strategies and detailed test plans for data warehousing and business intelligence projects.


- Analyze business requirements, source-to-target mappings (STTMs), and data models to identify appropriate test scenarios and data validation rules.



Test Case Design & Execution :


- Design, develop, and execute robust ETL test cases, SQL queries, and scripts to validate data extraction, transformation logic, and loading processes.


- Perform data validation across various stages of the ETL pipeline, including source data verification, data transformation validation, and target data reconciliation.


- Conduct data integrity testing, data completeness testing, data accuracy testing, and performance testing for ETL jobs.



Data Validation & Reconciliation :


- Write complex SQL queries to compare source and target data, identify discrepancies, and validate data transformations.


- Reconcile data counts, aggregates, and specific data points between source systems and data warehouse/data lake environments.


- Ensure that all data is correctly loaded into the target systems as per business rules and technical specifications.
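As an illustration of the kind of reconciliation described above, the sketch below compares row counts and an aggregate between a source and a target table. It uses Python's built-in sqlite3 with an in-memory database; the table and column names (`src_orders`, `dw_orders`, `amount`) are hypothetical examples, not a specific system.

```python
import sqlite3

# In-memory database standing in for source and target systems
# (table and column names here are hypothetical examples).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 100.0), (2, 250.5), (3, 75.25);
    INSERT INTO dw_orders  VALUES (1, 100.0), (2, 250.5), (3, 75.25);
""")

# Reconcile row counts and a key aggregate between source and target.
cur.execute("SELECT COUNT(*), ROUND(SUM(amount), 2) FROM src_orders")
src_count, src_total = cur.fetchone()
cur.execute("SELECT COUNT(*), ROUND(SUM(amount), 2) FROM dw_orders")
tgt_count, tgt_total = cur.fetchone()

assert src_count == tgt_count, f"Row count mismatch: {src_count} vs {tgt_count}"
assert src_total == tgt_total, f"Amount total mismatch: {src_total} vs {tgt_total}"
print("Reconciliation passed:", src_count, "rows,", src_total, "total")
```

In practice the same count/aggregate queries would run against the actual source system and the warehouse, with discrepancies logged as defects rather than raised as assertions.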



Defect Management :


- Identify, log, track, and retest defects using defect management tools, providing clear and concise documentation of issues.


- Collaborate with ETL developers, data architects, and business analysts to facilitate timely resolution of identified defects.



Automation & Scripting :


- Contribute to the development and maintenance of automated ETL testing scripts and frameworks to improve efficiency and coverage.


- Utilize scripting languages (e.g., Python, Shell Scripting) for data comparison and validation tasks where applicable.

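A minimal sketch of the kind of scripted data comparison this covers, in pure Python with hypothetical sample rows (the function name `diff_rows` and the row layout are illustrative, not any particular framework):

```python
def diff_rows(source_rows, target_rows, key_index=0):
    """Compare two row sets by business key and report discrepancies.

    Rows are tuples; the value at key_index is the business key.
    (Function name and structure are illustrative, not a specific framework.)
    """
    src = {row[key_index]: row for row in source_rows}
    tgt = {row[key_index]: row for row in target_rows}

    return {
        # Keys present in source but not loaded into the target
        "missing_in_target": sorted(src.keys() - tgt.keys()),
        # Keys in the target with no matching source record
        "unexpected_in_target": sorted(tgt.keys() - src.keys()),
        # Keys present in both but with differing values
        "mismatched": sorted(k for k in src.keys() & tgt.keys() if src[k] != tgt[k]),
    }

# Hypothetical sample data: (customer_id, name, city)
source = [(1, "Asha", "Pune"), (2, "Ravi", "Delhi"), (3, "Meera", "Chennai")]
target = [(1, "Asha", "Pune"), (2, "Ravi", "Mumbai")]  # id 3 missing, id 2 differs

report = diff_rows(source, target)
print(report)
```

A real automation framework would fetch the rows from the source and target databases and feed the resulting report into the defect-tracking workflow.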



Reporting & Documentation :


- Prepare and present detailed test reports, including test results, defect summaries, and overall data quality assessments.


- Maintain comprehensive documentation of test cases, test data, test results, and ETL processes.



Collaboration & Communication :


- Work closely with data engineers, developers, business analysts, and project managers throughout the data lifecycle.


- Participate in requirement-gathering sessions and design reviews, providing valuable input from a testing perspective.



Required Skills & Qualifications :



Experience : 3-5 years of dedicated experience in ETL Testing, Data Warehouse Testing, or Data Quality Assurance.



Database & SQL Expertise :


- Strong proficiency in writing complex SQL queries for data validation, reconciliation, and analysis.


- Hands-on experience with relational databases (e.g., Oracle, SQL Server, PostgreSQL, MySQL) and understanding of database concepts.



ETL Concepts : Solid understanding of ETL processes, data warehousing concepts (e.g., Star Schema, Snowflake Schema, SCDs), and data modeling.
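One common warehouse check tied to these concepts is validating a Type 2 slowly changing dimension, which should carry exactly one current row per business key. The sqlite3 sketch below illustrates the idea; the table `dim_customer` and its columns are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical Type 2 SCD customer dimension: one row per version,
# with is_current flagging the active version of each business key.
cur.executescript("""
    CREATE TABLE dim_customer (
        customer_key INTEGER,   -- surrogate key
        customer_id  INTEGER,   -- business key
        city         TEXT,
        is_current   INTEGER    -- 1 = active version, 0 = historical
    );
    INSERT INTO dim_customer VALUES
        (10, 1, 'Pune',   0),
        (11, 1, 'Mumbai', 1),
        (12, 2, 'Delhi',  1);
""")

# A Type 2 dimension must have exactly one current row per business key;
# this query returns every key that violates that rule.
cur.execute("""
    SELECT customer_id, SUM(is_current) AS current_rows
    FROM dim_customer
    GROUP BY customer_id
    HAVING SUM(is_current) <> 1
""")
violations = cur.fetchall()
assert violations == [], f"SCD Type 2 violation(s): {violations}"
print("SCD Type 2 current-row check passed")
```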



ETL Tools (Conceptual) : Familiarity with common ETL tools (e.g., Informatica PowerCenter, Talend, DataStage, SSIS) from a testing perspective (understanding their output and how to validate it).



Testing Methodologies : Experience with various testing methodologies (e.g., Agile, Waterfall) and the software development lifecycle (SDLC).



Test Management Tools : Hands-on experience with test management and defect tracking tools (e.g., JIRA, Azure DevOps, ALM Quality Center).



Analytical Skills : Strong analytical and problem-solving skills with a keen eye for detail and data discrepancies.



Education : Bachelor's degree in Computer Science, Information Technology, Engineering, or a related quantitative field.



Nice to Have :


- Experience with cloud data platforms (e.g., AWS Redshift, Azure Synapse, Google BigQuery, Snowflake).


- Familiarity with big data technologies (e.g., Hadoop, Spark).


- Scripting experience (e.g., Python, Shell scripting) for automation.


- Knowledge of BI reporting tools (e.g., Tableau, Power BI) for validating report data.


- Relevant certifications in SQL, Data Warehousing, or Testing.



Soft Skills :


- Excellent verbal and written communication skills.


- Strong interpersonal skills and ability to work effectively in a collaborative team environment.


- Proactive and self-motivated with a strong sense of ownership.


- Ability to manage multiple tasks and prioritize effectively in a fast-paced environment.


- High level of attention to detail and commitment to data quality.

