Job Description

Job Title : Data Engineer (Microsoft Fabric & Lakehouse)

Location : Hybrid (Bangalore, India)

Experience : 5+ Years

Joining : Immediate

Hiring Process : One interview + One case study round

Key Responsibilities :


Data Pipeline Development :


- Design, develop, and maintain robust data pipelines using Microsoft Fabric components including Dataflows, Pipelines, Notebooks (PySpark), and Lakehouse architecture.

- Ingest, cleanse, and transform structured and semi-structured data from diverse sources (cloud, on-prem, APIs, SaaS platforms).

Lakehouse Architecture Implementation :


- Manage Delta Tables across bronze, silver, and gold layers, adhering to the medallion architecture in OneLake or equivalent lakehouse environments.

- Implement metadata-driven design patterns to enhance pipeline reusability and scalability.

Data Modeling & BI Integration :


- Build and maintain semantic models for business reporting.

- Define standardized KPIs and metrics for consumption in Power BI or equivalent visualization tools.

Performance Monitoring & Optimization :


- Monitor data pipelines for performance, reliability, and failures.

- Implement logging, monitoring, alerting, and observability tools to ensure end-to-end pipeline health.

Data Governance & Quality :


- Enforce data governance best practices including schema versioning, lineage tracking, RBAC, and audit trails.

- Conduct robust data quality checks (null checks, outlier detection, duplicate handling, schema drift detection, SCD management).

Required Skills & Qualifications :


- 2-5 years of hands-on experience in Data Engineering, Data Warehousing, or related roles.

- Strong experience with PySpark, SQL, and data development using Fabric notebooks and pipelines.

- Deep understanding of Lakehouse architectures and experience with Microsoft Fabric, OneLake, or similar platforms such as Databricks, Snowflake, or Azure Synapse.

- Proven experience with Delta Lake or equivalent transactional storage technologies.

- Experience integrating data from SaaS platforms (e.g., Salesforce, Workday), REST APIs, and enterprise databases.

- Working knowledge of Power BI, Tableau, or similar BI tools.

- Solid grasp of data governance principles, including lineage, RBAC, and data quality frameworks.

- Strong problem-solving, analytical, and communication skills.

