Posted on: 14/03/2026
Job Overview :
We are looking for a skilled ETL Tester with hands-on experience in data validation, ETL/ELT testing, SQL, and Python automation. The ideal candidate should have strong knowledge of data warehousing concepts, ETL architecture, and automated testing frameworks for validating large-scale data pipelines.
The role requires working closely with data engineers, developers, and business teams to ensure the accuracy, quality, and integrity of data across the data platform.
Mandatory Skills :
ETL / ELT Testing :
- Hands-on experience in S2T (source-to-target) validation, transformation-rule testing, and data reconciliation.
- Experience designing regression strategies for data pipelines.
Advanced SQL :
- Expertise in joins, CTEs, window functions, and aggregations.
- Ability to write performance-optimized queries for large datasets.
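The SQL skills above can be illustrated with a small, hedged sketch: a CTE plus a window function run against an in-memory SQLite database (window functions require SQLite 3.25+). The `orders` table and values are hypothetical, chosen only to make the query runnable.

```python
# Hypothetical sketch: a CTE + window function on an in-memory SQLite
# database. The orders table is illustrative, not from any real schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'acme', 100.0), (2, 'acme', 250.0), (3, 'globex', 75.0);
""")

# CTE + window function: rank each customer's orders by amount,
# then keep only the largest order per customer.
query = """
WITH ranked AS (
    SELECT customer, amount,
           ROW_NUMBER() OVER (PARTITION BY customer ORDER BY amount DESC) AS rn
    FROM orders
)
SELECT customer, amount FROM ranked WHERE rn = 1 ORDER BY customer;
"""
top_orders = conn.execute(query).fetchall()
print(top_orders)  # largest order per customer
```

On large production datasets the same pattern would be paired with appropriate indexing and partition pruning rather than a full scan.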
Python Automation :
- Strong experience with Python automation and Pytest.
- Ability to build and maintain automated data validation test suites.
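A minimal sketch of what such a Pytest-based validation suite might look like, assuming illustrative table names (`src_customers` / `tgt_customers`) and an in-memory SQLite stand-in for the real source and target; Pytest discovers and runs the `test_*` functions when invoked as `pytest <file>.py`.

```python
# Hypothetical sketch of an automated data validation suite in Pytest
# style. Table names are illustrative stand-ins, not a real schema.
import sqlite3

def make_conn():
    """Build an in-memory stand-in for the source and target tables."""
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE src_customers (id INTEGER, name TEXT);
        CREATE TABLE tgt_customers (id INTEGER, name TEXT);
        INSERT INTO src_customers VALUES (1, 'a'), (2, 'b');
        INSERT INTO tgt_customers VALUES (1, 'a'), (2, 'b');
    """)
    return conn

def row_count(conn, table):
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

def test_row_counts_match():
    # Basic completeness check: no rows dropped between source and target.
    conn = make_conn()
    assert row_count(conn, "src_customers") == row_count(conn, "tgt_customers")

def test_target_keys_not_null():
    # Basic validity check: the key column must never be NULL in the target.
    conn = make_conn()
    nulls = conn.execute(
        "SELECT COUNT(*) FROM tgt_customers WHERE id IS NULL"
    ).fetchone()[0]
    assert nulls == 0
```

In a real suite the connection would point at the actual warehouse and the checks would be parametrized per table.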
ETL Architecture & Data Warehouse Concepts :
- Strong understanding of ETL/ELT data flows.
- Knowledge of staging, ODS (operational data store), and data warehouse architecture.
- Understanding of data quality dimensions (e.g. completeness, accuracy, consistency, timeliness).
Test Management :
- Experience using Jira and/or X-Ray for test management and defect tracking.
Agile Methodology :
- Experience working in Agile/Scrum development environments.
Must-Have Technical Skills :
- Strong Unix/Linux command knowledge for log validation, job monitoring, and troubleshooting.
- Strong Python fundamentals, including :
  - Data structures (lists, tuples, sets, dictionaries)
  - Object-oriented programming
  - File handling and debugging
  - Database connectivity
- Experience with Pandas / DataFrames and lambda functions for data transformations and checks.
- Basic exposure to cloud environments (AWS preferred).
- Experience with Boto3 is an added advantage.
- Strong communication and documentation skills to explain data issues to both technical and business teams.
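The Pandas and lambda requirement above can be sketched briefly; the DataFrame below is fabricated purely to show a lambda-driven column check of the kind a validation suite might run.

```python
# Hypothetical sketch: a Pandas column check driven by a lambda.
# The DataFrame contents are made up; row 2 is deliberately invalid.
import pandas as pd

df = pd.DataFrame({
    "order_id": [1, 2, 3],
    "amount": [100.0, -5.0, 250.0],  # -5.0 violates the rule below
})

# Lambda-driven rule: amounts must be non-negative.
bad_rows = df[df["amount"].apply(lambda x: x < 0)]
print(len(bad_rows))  # number of rows failing the check
```

The same pattern extends to any row-level rule: swap the lambda, keep the filter.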
Good-to-Have Skills :
- Understanding of Databricks and PySpark, especially for testing Lakehouse architectures.
- Experience with cloud data platforms such as:
  - Snowflake
  - Databricks
  - BigQuery
  - Redshift
  - Synapse
- Experience with RDBMS such as PostgreSQL, SQL Server, or Oracle.
- ETL / ELT Tools :
  - Informatica
  - DataStage
  - SSIS
  - Talend
  - dbt
  - Airflow
- Data Quality Tools :
  - Great Expectations
  - Deequ
- Knowledge of metadata and data lineage concepts.
- CI/CD & Reporting :
  - Allure
  - JUnit XML
  - Environment and release management
- API Testing :
  - Postman
  - REST API validation for data services
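REST API validation for data services can be sketched without a live endpoint: the payload below is canned, where in practice it would come from `urllib.request` or `requests` against a real URL, and the contract checks (`status`, `rows`, per-row `id`) are hypothetical examples of a response schema.

```python
# Hypothetical sketch: validating the shape of a JSON response from a
# data-service endpoint. The payload is canned; in a real test it would
# be fetched over HTTP. Field names here are assumptions, not a real API.
import json

payload = json.loads('{"status": "ok", "rows": [{"id": 1}, {"id": 2}]}')

def validate_response(resp):
    """Basic contract checks for a hypothetical /rows endpoint."""
    assert resp.get("status") == "ok", "unexpected status"
    assert isinstance(resp.get("rows"), list), "rows must be a list"
    assert all("id" in r for r in resp["rows"]), "every row needs an id"
    return True

print(validate_response(payload))  # True when all checks pass
```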
Key Responsibilities :
- Perform hands-on ETL/ELT testing including S2T (source-to-target) validation, transformation-rule testing, and data reconciliation.
- Design and execute regression testing strategies for data pipelines.
- Write and optimize complex SQL queries (joins, CTEs, window functions, aggregations) for large datasets.
- Develop and maintain automation scripts using Python and Pytest for data validation.
- Validate ETL workflows and data warehouse layers including staging, ODS, and warehouse environments.
- Track and manage defects using Jira or X-Ray, ensuring proper documentation and closure.
- Work in Agile/Scrum environments supporting sprint activities and story-level testing.
- Collaborate with developers and data engineers to identify and resolve data quality issues.
- Document test results and provide insights on data discrepancies and system behavior.
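The S2T reconciliation responsibility above can be sketched as a profile-and-compare check: compute row counts and an amount checksum on both sides and diff them. Table names (`stg_sales` / `dw_sales`) and data are illustrative, with SQLite standing in for the staging and warehouse databases.

```python
# Hypothetical sketch of S2T (source-to-target) reconciliation: compare
# row counts and an amount checksum between staging and target tables.
# Table names and rows are made up for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_sales (id INTEGER, amount REAL);
    CREATE TABLE dw_sales  (id INTEGER, amount REAL);
    INSERT INTO stg_sales VALUES (1, 10.0), (2, 20.5), (3, 30.0);
    INSERT INTO dw_sales  VALUES (1, 10.0), (2, 20.5), (3, 30.0);
""")

def profile(table):
    """Cheap reconciliation profile: row count plus a sum checksum."""
    cnt, total = conn.execute(
        f"SELECT COUNT(*), ROUND(SUM(amount), 2) FROM {table}"
    ).fetchone()
    return {"rows": cnt, "amount_sum": total}

src, tgt = profile("stg_sales"), profile("dw_sales")
mismatches = {k: (src[k], tgt[k]) for k in src if src[k] != tgt[k]}
print(mismatches)  # empty dict when source and target reconcile
```

A non-empty `mismatches` dict would then be documented as a defect with the offending metric and both observed values.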
Posted in : Quality Assurance
Functional Area : QA & Testing
Job Code : 1620716