Posted on: 22/09/2025
About the Job:
Key Responsibilities:
- Design, develop, and implement efficient ETL processes to support data integration and business intelligence needs.
- Optimize ETL performance and ensure scalability of data processes.
- Ensure data quality, consistency, and reliability across systems.
- Collaborate with data architects, analysts, and business stakeholders to gather requirements and deliver solutions.
- Perform unit testing and support UAT to validate ETL processes.
- Maintain documentation for ETL processes, workflows, and data mappings.
- Monitor ETL jobs and proactively resolve failures or performance issues.
- Contribute to continuous improvement by recommending new tools, technologies, and best practices.
Core Skills & Requirements:
- Proficiency in SQL and relational database management systems (Oracle, MySQL, SQL Server, PostgreSQL, etc.).
- Solid understanding of data warehousing concepts, data modeling, and schema design.
- Experience in handling large datasets, data transformation, and integration.
- Knowledge of performance tuning and optimization in ETL workflows.
- Familiarity with scripting languages (Python, Shell, or similar) for automation.
- Good understanding of cloud data platforms (AWS, Azure, GCP) is a plus.
Posted in: Data Engineering
Functional Area: Big Data / Data Warehousing / ETL
Job Code: 1550646