Posted on: 14/01/2026
Description:
- Design, build, and maintain robust, scalable ETL processes for data acquisition, transformation, and delivery;
- Collaborate with Data Analysts, Architects, and Compliance stakeholders to translate requirements into efficient ETL solutions;
- Ensure data quality, integrity, lineage, and traceability across large-scale datasets;
- Optimize ETL workflows for performance, scalability, and maintainability;
- Perform data profiling, troubleshooting, and root cause analysis to resolve pipeline or quality issues;
- Deploy, monitor, and support ETL jobs across development, QA, and production environments;
- Contribute to the development of technical standards, documentation, and operational best practices.
What you'll bring:
- Hands-on ETL development experience in enterprise data environments;
- Proven expertise in designing and optimizing complex data pipelines;
- Strong understanding of data modeling, data integration, and data warehousing concepts;
- Experience with modern ETL tools (e.g., Informatica, Talend, DataStage, or equivalent);
- Advanced SQL skills and familiarity with scripting (Python, Shell, etc.) for workflow automation;
- Knowledge of data governance, quality, and lineage frameworks;
- Experience working in regulated industries (financial services or similar) is a strong plus;
- Excellent performance tuning and troubleshooting skills;
- Strong communication skills and proven success in agile, cross-functional teams.
Posted in: Data Engineering
Functional Area: Big Data / Data Warehousing / ETL
Job Code: 1601146