Posted on: 05/10/2025
Description:
Key Responsibilities:
- ETL Pipeline Development: Designing, developing, and implementing complex ETL processes and data integration solutions using Azure Data Factory.
- Data Modeling and Architecture: Designing and implementing scalable data models and architectures within the Microsoft Azure cloud environment.
- Cloud Integration: Working with other Azure services such as Azure Data Lake Storage, Azure SQL Database, Azure Synapse Analytics, and Azure Databricks.
- Data Requirements Translation: Collaborating with data analysts, data scientists, and business stakeholders to understand their data requirements and translate them into technical solutions.
- Data Quality and Security: Ensuring the quality, security, and privacy of data throughout the data lifecycle.
- Monitoring and Performance: Monitoring and managing pipeline performance, setting up alerts for proactive issue resolution, and optimizing operational productivity (a minimal orchestration and monitoring sketch follows this list).
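
To illustrate the kind of orchestration and monitoring work described above, here is a minimal sketch that triggers an Azure Data Factory pipeline run and polls its status using the azure-identity and azure-mgmt-datafactory Python packages. The subscription, resource group, factory, pipeline, and parameter names are placeholders for illustration only; in practice, alerting would typically be handled through ADF alerts or Azure Monitor rather than a polling loop.

    # Sketch: trigger an ADF pipeline run and poll its status.
    # Assumes azure-identity and azure-mgmt-datafactory are installed;
    # all resource names below are placeholders, not values from this posting.
    import time

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    SUBSCRIPTION_ID = "<subscription-id>"    # placeholder
    RESOURCE_GROUP = "rg-data-platform"      # placeholder
    FACTORY_NAME = "adf-ingestion"           # placeholder
    PIPELINE_NAME = "pl_copy_sales_to_sql"   # placeholder

    # Authenticate with whatever credential the environment provides
    # (managed identity, Azure CLI login, environment variables, ...).
    credential = DefaultAzureCredential()
    adf_client = DataFactoryManagementClient(credential, SUBSCRIPTION_ID)

    # Kick off a pipeline run, optionally passing pipeline parameters.
    run = adf_client.pipelines.create_run(
        RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME,
        parameters={"load_date": "2025-10-01"},
    )

    # Poll the run until it reaches a terminal state.
    while True:
        status = adf_client.pipeline_runs.get(
            RESOURCE_GROUP, FACTORY_NAME, run.run_id
        ).status
        print(f"Pipeline run {run.run_id}: {status}")
        if status not in ("InProgress", "Queued"):
            break
        time.sleep(30)
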
Required Skills:
- Azure Data Factory (ADF): Deep knowledge of and hands-on experience with ADF for data orchestration and transformation.
- SQL: Advanced proficiency in SQL for database management and data manipulation (see the query sketch after this list).
- Programming Languages: Experience with languages such as Python, Scala, or PowerShell for data engineering tasks and automation.
- Cloud Services: Strong understanding of the Microsoft Azure cloud platform, including its data storage, database, and analytics services.
- Data Warehousing/Modeling: Familiarity with data warehousing concepts and data modeling.
- ETL Concepts: A solid understanding of ETL processes and related tools.
- Problem-Solving: Excellent analytical and troubleshooting skills to resolve complex technical challenges.
- Communication: Strong communication and collaboration skills to work effectively with various teams and stakeholders.
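
As a small example of how the SQL and Python skills above come together, the following sketch queries an Azure SQL Database from Python with pyodbc. It assumes the pyodbc package and ODBC Driver 18 for SQL Server are available; the server, database, credentials, and table names are illustrative placeholders.

    # Sketch: run a parameterised aggregation query against Azure SQL Database.
    # Assumes pyodbc and ODBC Driver 18 for SQL Server; names are placeholders.
    import pyodbc

    conn_str = (
        "DRIVER={ODBC Driver 18 for SQL Server};"
        "SERVER=tcp:my-sql-server.database.windows.net,1433;"  # placeholder
        "DATABASE=sales_dw;"                                   # placeholder
        "UID=<sql-user>;PWD=<password>;"                       # placeholders
        "Encrypt=yes;"
    )

    with pyodbc.connect(conn_str) as conn:
        cursor = conn.cursor()
        # Aggregate daily order totals for one month.
        cursor.execute(
            """
            SELECT CAST(order_date AS date) AS order_day,
                   SUM(amount)              AS total_amount
            FROM   dbo.orders
            WHERE  order_date >= ? AND order_date < ?
            GROUP BY CAST(order_date AS date)
            ORDER BY order_day;
            """,
            ("2025-09-01", "2025-10-01"),
        )
        for row in cursor.fetchall():
            print(row.order_day, row.total_amount)
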
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1555513