Posted on: 02/12/2025
Description:
Key Roles and Responsibilities:
- Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT.
- Build and maintain data integration workflows from various data sources to Snowflake.
- Write efficient, well-optimized SQL queries for data extraction and transformation (see the illustrative sketch after this list).
- Work with stakeholders to understand business requirements and translate them into technical solutions.
- Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
- Maintain and enforce data quality, governance, and documentation standards.
- Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.
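For illustration only, a minimal sketch of the kind of Snowflake SQL such a pipeline might run once an ADF copy activity has landed files in cloud storage. The stage, table, and column names (azure_landing_stage, raw.orders, staging.orders_clean, order_id, and so on) are assumptions, not details of this role's actual environment.

    -- Illustrative only: stage, table, and column names are assumptions.
    -- Load files that an ADF copy activity has landed in an Azure-backed stage.
    COPY INTO raw.orders
      FROM @raw.azure_landing_stage/orders/
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
      ON_ERROR = 'ABORT_STATEMENT';

    -- Basic cleansing/transformation of the landed data into a staging table.
    CREATE OR REPLACE TABLE staging.orders_clean AS
    SELECT
        order_id,
        customer_id,
        TRY_TO_DATE(order_date)            AS order_date,
        TRY_TO_NUMBER(order_amount, 10, 2) AS order_amount
    FROM raw.orders
    WHERE order_id IS NOT NULL;

In practice, the load and the downstream transformation would typically be triggered from the orchestration layer (an ADF pipeline or a Snowflake task) rather than run by hand.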
Must Have:
- Strong experience with Azure Cloud Platform services.
- Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
- Proficiency in SQL for data analysis and transformation.
- Hands-on experience with Snowflake and SnowSQL for data warehousing.
- Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse (a minimal model sketch follows this list).
- Experience working in cloud-based data environments with large-scale datasets.
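As a rough illustration of in-warehouse transformation with DBT, here is a minimal staging model; the file path, source, and column names (stg_orders, source('raw', 'orders'), and so on) are hypothetical.

    -- models/staging/stg_orders.sql  (hypothetical model; names are assumptions)
    -- A typical DBT staging model: select from a declared source, rename, and cast.
    {{ config(materialized='view') }}

    select
        order_id,
        customer_id,
        cast(order_date as date)            as order_date,
        cast(order_amount as number(10, 2)) as order_amount
    from {{ source('raw', 'orders') }}
    where order_id is not null

DBT compiles the Jinja references into fully qualified Snowflake objects, so transformations like this can be version-controlled, tested, and documented alongside the rest of the project.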
Posted By: Devendra Karlekar
Associate Recruitment Consultant at EXL Services.com (I) Pvt. Ltd.
Last Active: 3 Dec 2025
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1583572