hirist

Lakehouse Data Engineer - Microsoft Fabric

Aliqan Services Private Limited
Bangalore
2 - 5 Years

Posted on: 20/07/2025

Job Description

Job Title : Lakehouse Data Engineer - Microsoft Fabric

Location : Hybrid (Bangalore, India)

Experience : 2 to 5 Years

Joining : Immediate

Hiring Process : One interview + One case study round

About the Role :

We are looking for a skilled Lakehouse Data Engineer - Microsoft Fabric with 2 to 5 years of experience to join our dynamic team. The ideal candidate will be responsible for designing and developing scalable, reusable, and efficient data pipelines using modern Data Engineering platforms such as Microsoft Fabric, PySpark, and Data Lakehouse architectures.

You will play a key role in integrating data from diverse sources, transforming it into actionable insights, and ensuring high standards of data governance and quality. This role requires a strong understanding of modern data architectures, pipeline observability, and performance optimization.

Key Responsibilities :

- Design and build robust data pipelines using Microsoft Fabric components including Pipelines, Notebooks (PySpark), Dataflows, and Lakehouse architecture.

- Ingest and transform data from a variety of sources such as cloud platforms (Azure, AWS), on-prem databases, SaaS platforms (e.g., Salesforce, Workday), and REST/OpenAPI-based APIs.

- Develop and maintain semantic models and define standardized KPIs for reporting and analytics in Power BI or equivalent BI tools.

- Implement and manage Delta Tables across bronze/silver/gold layers using Lakehouse medallion architecture within OneLake or equivalent environments.

- Apply metadata-driven design principles to support pipeline parameterization, reusability, and scalability.

- Monitor, debug, and optimize pipeline performance; implement logging, alerting, and observability mechanisms.

- Establish and enforce data governance policies including schema versioning, data lineage tracking, role-based access control (RBAC), and audit trail mechanisms.

- Perform data quality checks including null detection, duplicate handling, schema drift management, outlier identification, and Slowly Changing Dimensions (SCD) type management.
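To give candidates a concrete feel for the last responsibility, here is a minimal, illustrative sketch of two of the named checks (null detection and duplicate handling) in plain Python. In production these checks would typically run in PySpark notebooks over Delta Tables; the function names and sample records below are invented for demonstration only.

```python
def find_null_fields(records, required_fields):
    """Return (row_index, field) pairs where a required field is missing or None."""
    issues = []
    for i, rec in enumerate(records):
        for field in required_fields:
            if rec.get(field) is None:
                issues.append((i, field))
    return issues


def deduplicate(records, key_fields):
    """Duplicate handling: keep the first record seen for each business key."""
    seen = set()
    unique = []
    for rec in records:
        key = tuple(rec.get(f) for f in key_fields)
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique


# Invented sample data for illustration.
records = [
    {"id": 1, "name": "Asha", "email": "asha@example.com"},
    {"id": 2, "name": None, "email": "b@example.com"},
    {"id": 1, "name": "Asha", "email": "asha@example.com"},  # duplicate key
]

nulls = find_null_fields(records, ["id", "name", "email"])
clean = deduplicate(records, ["id"])
print(nulls)       # [(1, 'name')]
print(len(clean))  # 2
```

Schema drift, outlier detection, and SCD management follow the same pattern at larger scale, usually expressed as PySpark transformations or Delta `MERGE` operations rather than plain Python.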

Required Skills & Qualifications :

- 2 to 5 years of hands-on experience in Data Engineering or related fields.

- Solid understanding of data lake/lakehouse architectures, preferably with Microsoft Fabric or equivalent tools (e.g., Databricks, Snowflake, Azure Synapse).

- Strong experience with PySpark, SQL, and working with dataflows and notebooks.

- Exposure to BI tools like Power BI, Tableau, or equivalent for data consumption layers.

- Experience with Delta Lake or similar transactional storage layers.

- Familiarity with data ingestion from SaaS applications, APIs, and enterprise databases.

- Understanding of data governance, lineage, and RBAC principles.

- Strong analytical, problem-solving, and communication skills.
