hirist

Job Description

Key Responsibilities:

- Design and implement end-to-end ETL (Extract, Transform, Load) solutions leveraging Microsoft Fabric capabilities.

- Utilize Azure Data Factory (ADF) or Fabric Data Pipelines to create, configure, and manage Pipelines, datasets, dataflows, and Integration Runtimes.

- Develop solutions for data extraction, transformation, and processing using Azure Databricks (PySpark/Spark SQL) to handle large datasets.

- Create and execute complex SQL scripts and queries to perform data analysis, aggregation, and validation in Azure SQL and Azure Synapse Analytics.

- Develop Synapse pipelines or Fabric Data Pipelines to orchestrate data movement and migration, particularly from Azure Data Lake Gen2 to Azure SQL.

- Execute data migration pipelines to seamlessly move data to the Azure cloud, specifically to Azure SQL.

- Perform database migration from on-premises SQL Server to Azure development environments using tools like Azure Data Migration Service (DMS) and the Data Migration Assistant (DMA).

- Apply experience across different data processing paradigms, including Big Data Batch Processing Solutions, Interactive Processing Solutions, and Real-Time Processing Solutions.

- Ensure data discoverability and governance by leveraging tools like Azure Data Catalog or Fabric's equivalent capabilities.

- Implement effective monitoring for data pipelines and trigger runs within Azure Data Factory or Fabric to ensure operational stability and timely issue resolution.
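The SQL analysis, aggregation, and validation work described above can be sketched as follows. This is a minimal illustration using Python's built-in SQLite as a local stand-in for Azure SQL / Synapse; the table and column names (`orders`, `region`, `amount`) are illustrative assumptions, not part of any specific project.

```python
import sqlite3

# Stand-in for an Azure SQL / Synapse connection (illustrative only).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER PRIMARY KEY, region TEXT, amount REAL);
    INSERT INTO orders (region, amount) VALUES
        ('east', 120.0), ('east', 80.0), ('west', 200.0), ('west', NULL);
""")

# Aggregation: order count and total amount per region.
totals = conn.execute("""
    SELECT region, COUNT(*) AS n_orders, SUM(amount) AS total_amount
    FROM orders
    GROUP BY region
    ORDER BY region
""").fetchall()

# Validation: flag rows that would break a downstream load (NULL amounts).
bad_rows = conn.execute(
    "SELECT order_id FROM orders WHERE amount IS NULL"
).fetchall()

print(totals)    # per-region aggregates
print(bad_rows)  # rows failing the NULL-amount check
conn.close()
```

The same GROUP BY / validation pattern applies unchanged in Azure SQL or Synapse; only the connection layer differs.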


Required Qualifications:

- Experience: 3 to 5 years of hands-on experience in Data Engineering.

- Microsoft Data Platform: Strong practical experience with key Azure services including Microsoft Fabric, Azure Data Factory (ADF), Azure Databricks, Azure Synapse Analytics, and Azure SQL.

- ETL/ELT Proficiency: Proven ability to create, deploy, and manage data integration processes using ADF/Fabric pipelines, dataflows, and integration runtimes.

- SQL Expertise: Advanced proficiency in writing and optimizing SQL scripts for complex queries and data manipulation.

- Data Migration Tools: Experience using Azure Data Migration Service (DMS) and the Data Migration Assistant (DMA) for cloud migration projects.

- Data Processing Concepts: Solid understanding of concepts related to Big Data Batch, Interactive, and Real-Time processing solutions.

- Cloud Concepts: Familiarity with Azure storage solutions (Azure Data Lake Gen2) and data governance tools (Azure Data Catalog).

