Posted on: 16/04/2026
Description:
- Design and implement end-to-end data pipelines using Azure Data Factory and/or Microsoft Fabric pipelines
- Build and maintain Medallion Architecture (Bronze, Silver, Gold layers) for scalable data processing
- Develop a semantic layer on top of curated data to support business reporting and analytics
- Transform raw data into clean, structured, and business-ready datasets
- Integrate data from multiple sources (SAP, SQL Server, APIs, CSV, etc.)
- Design and implement data models (Star Schema / Dimensional Modeling)
- Develop and optimize Power BI reports and dashboards
- Ensure data quality, governance, and security best practices
- Monitor and optimize pipeline performance and data refresh cycles
- Troubleshoot and resolve production issues and data failures
- Collaborate with business stakeholders to understand and translate requirements into technical solutions
Required Skills:
- Strong experience with Azure Data Factory (ADF) and/or Microsoft Fabric Pipelines
- Hands-on experience with Azure SQL, Lakehouse, Delta Tables
- Expertise in Medallion Architecture (Bronze, Silver, Gold)
- Strong experience in building semantic layers for reporting (Power BI datasets/models)
- Solid understanding of ETL/ELT processes
- Advanced knowledge of Data Modeling (Star Schema, Fact & Dimension tables)
- Strong SQL skills
- Experience with Power BI (DAX, Power Query, performance tuning)
- Experience integrating data from SAP or similar ERP systems
Good to Have:
- Experience with Delta Lake and Change Data Feed (CDF)
- Knowledge of Python / PySpark
- Experience with CI/CD pipelines (Azure DevOps)
- Exposure to large-scale production data environments
- Understanding of data governance frameworks
Experience Required:
- 5+ years of experience in Data Engineering / BI / Analytics
- 2+ years of experience working with Azure Data Platform or Microsoft Fabric
Education:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
Soft Skills:
- Strong problem-solving and analytical skills
- Ability to handle critical production environments and incidents
- Good communication skills with both technical and non-technical stakeholders
- Ability to work in a collaborative, fast-paced environment
Nice to Have (Project-Specific):
- Experience handling production breakdowns and system rebuilds
- Experience working in global teams (EST/IST time zones)
- Exposure to data modernization projects
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1629104