Posted on: 22/07/2025
Position: MS Fabric Solution Engineer
Experience: 5+ Years
Location: Noida (Hybrid)
Job Description:
MS Fabric Solution Engineer
Key Responsibilities:
- Lead the technical design, architecture, and hands-on implementation of Microsoft Fabric PoCs, translating business needs into effective data solutions and applying Medallion Architecture principles within the Lakehouse.
- Develop and optimize ELT/ETL pipelines for diverse data sources:
- Static data (e.g., CIM XML, equipment models, Velocity Suite data).
- Streaming data (e.g., measurements from grid devices via Event Hub and IoT Hub).
- Seamlessly integrate Fabric with internal systems (e.g., CRM, ERP) using RESTful APIs, data mirroring, Azure Integration Services, and CDC (Change Data Capture) mechanisms.
- Hands-on configuration and management of core Fabric components: OneLake, Lakehouse, Notebooks (PySpark/KQL), and Real-Time Analytics databases.
- Facilitate data access via GraphQL interfaces, Power BI Embedded, and Direct Lake connections, ensuring optimal performance for self-service BI and adhering to RLS/OLS.
- Work closely with Microsoft experts, SMEs, and stakeholders.
- Document architecture, PoC results, and provide recommendations for production readiness and data governance (e.g., Purview integration).
Required Skills & Experience:
- 5-10 years of experience in Data Engineering / BI / Cloud Analytics, with at least 1-2 projects using Microsoft Fabric (or strong Power BI + Synapse background transitioning to Fabric).
Proficient in:
- OneLake, Data Factory, Lakehouse, Real-Time Intelligence, Dataflow Gen2
- Ingestion using CIM XML, CSV, APIs, SDKs
- Power BI Embedded, GraphQL interfaces
- Azure Notebooks / PySpark / Fabric SDK
- Experience with data modeling (asset registry, nomenclature alignment, schema mapping).
- Familiarity with real-time streaming (Kafka/Kinesis/IoT Hub) and data governance concepts.
- Strong problem-solving and debugging skills.
- Prior experience with PoC/Prototype-style projects with tight timelines.
Good to Have:
- Knowledge of grid operations / energy asset management systems.
- Experience working on Microsoft-Azure joint engagements.
- Understanding of AI/ML workflow integration via Azure AI Foundry or similar.
- Relevant certifications: DP-600/700 or DP-203.
Functional Area: Backend Development
Job Code: 1517523