Posted on: 04/12/2025
About the Role:
We are looking for a skilled ETL Engineer to join our Data & Analytics team. The ideal candidate will be responsible for designing, developing, and maintaining robust ETL pipelines that enable efficient data movement, transformation, and integration across systems.
This role is ideal for someone who enjoys working with large datasets, optimizing SQL queries, and building scalable solutions in a cloud environment.
Key Responsibilities:
- Design, develop, and maintain ETL workflows and data pipelines using SQL and SSIS (SQL Server Integration Services); a minimal SQL sketch of a typical load step follows this list.
- Manage and optimize data extraction, transformation, and loading processes across multiple data sources.
- Implement ETL processes on Azure (Data Factory, Data Lake, or Synapse) ensuring scalability, reliability, and performance.
- Collaborate with data analysts, product teams, and business stakeholders to understand data needs and deliver clean, structured datasets.
- Ensure data quality, integrity, and consistency across systems.
- Monitor ETL jobs and troubleshoot issues to ensure smooth data operations.
- Architect and manage high-performance analytical data stores using SSAS, including developing robust cube design strategies for complex datasets.
- Containerize data processing applications using Docker and deploy them onto managed orchestration platforms like Azure Kubernetes Service (AKS), demonstrating a strong grasp of core Kubernetes concepts.
- Participate in design discussions and contribute to continuous improvement of data architecture and standards.
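As an illustration of the load step described above, here is a minimal T-SQL sketch of an incremental upsert from a staging table into a warehouse table. All object names (stg.Orders, dw.Orders, OrderID, and so on) are hypothetical, chosen only for the example.

    -- Upsert freshly staged rows into the persistent warehouse table.
    -- Assumes the extract step has already populated stg.Orders.
    MERGE dw.Orders AS tgt
    USING stg.Orders AS src
        ON tgt.OrderID = src.OrderID
    WHEN MATCHED THEN
        UPDATE SET
            tgt.CustomerID = src.CustomerID,
            tgt.OrderDate  = src.OrderDate,
            tgt.Amount     = src.Amount
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (OrderID, CustomerID, OrderDate, Amount)
        VALUES (src.OrderID, src.CustomerID, src.OrderDate, src.Amount);

In practice, a pattern like this would typically sit inside an SSIS Execute SQL task or an Azure Data Factory pipeline activity, with the staging load handled upstream.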
Required Skills and Experience:
- 3-5 years of hands-on experience in ETL development and data integration.
- Strong proficiency in SQL (complex queries, performance tuning, stored procedures).
- Solid experience with SSIS: designing, deploying, and optimizing ETL packages.
- Experience monitoring ETL jobs, troubleshooting failures, and applying timely fixes to keep data operations running smoothly.
- Experience working with Microsoft Azure data services such as Azure Data Factory, Azure SQL Database, or Azure Synapse.
- Experience implementing and optimizing dimensional models (star and snowflake schemas) to support business intelligence and analytical reporting; see the SQL sketch after this list.
- Good understanding of data warehousing concepts and data modelling.
- Strong problem-solving and analytical skills with attention to detail.
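To make the dimensional-modelling requirement concrete, here is a minimal star-schema sketch in T-SQL. The schemas, tables, and columns (dim.Customer, fact.Sales, and so on) are illustrative assumptions, not a prescribed design.

    -- One dimension per analysis axis; the fact table stores measures
    -- and references each dimension by its surrogate key.
    CREATE TABLE dim.Customer (
        CustomerKey  INT IDENTITY(1,1) PRIMARY KEY,
        CustomerID   INT NOT NULL,          -- business key from the source system
        CustomerName NVARCHAR(200) NOT NULL,
        Region       NVARCHAR(100) NULL
    );

    CREATE TABLE dim.Date (
        DateKey      INT PRIMARY KEY,       -- e.g. 20250412
        FullDate     DATE NOT NULL,
        CalendarYear INT NOT NULL,
        MonthNumber  INT NOT NULL
    );

    CREATE TABLE fact.Sales (
        SalesKey     BIGINT IDENTITY(1,1) PRIMARY KEY,
        CustomerKey  INT NOT NULL REFERENCES dim.Customer (CustomerKey),
        DateKey      INT NOT NULL REFERENCES dim.Date (DateKey),
        Quantity     INT NOT NULL,
        Amount       DECIMAL(18,2) NOT NULL
    );

A snowflake schema would further normalize the dimensions, for example moving Region out of dim.Customer into its own table.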
Good to Have:
- Experience with Tableau or other BI/visualization tools for data presentation and insights.
- Familiarity with Python or PowerShell for automation and scripting.
- Exposure to CI/CD for ETL pipelines or DevOps environments.
- Exposure to modern data architecture patterns, such as the Medallion Architecture, for managing data governance and accessibility; a brief sketch follows this list.
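For context, the Medallion Architecture organizes data into bronze (raw), silver (cleaned and conformed), and gold (business-level) layers. A minimal T-SQL sketch of a bronze-to-silver refinement step, using hypothetical table and column names, might look like:

    -- Bronze holds raw ingested rows; silver holds cleaned, deduplicated rows.
    INSERT INTO silver.Orders (OrderID, CustomerID, OrderDate, Amount)
    SELECT OrderID, CustomerID, OrderDate, Amount
    FROM (
        SELECT
            OrderID, CustomerID, OrderDate, Amount,
            ROW_NUMBER() OVER (PARTITION BY OrderID
                               ORDER BY IngestedAt DESC) AS rn
        FROM bronze.Orders
        WHERE OrderID IS NOT NULL          -- discard unusable raw rows
    ) AS deduped
    WHERE rn = 1;                          -- keep the latest version of each order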
Soft Skills:
- Self-motivated, with a proactive approach to learning and problem-solving.
- Ability to handle multiple priorities and deliver high-quality results within deadlines.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1584786