Posted on: 28/01/2026
Description :
Experience : 7 - 10 Years
Location : Bangalore
Industry : US-based Consulting & Advisory
Education : Bachelor's degree in Computer Science, Data Engineering, or a related technical field
Role Summary :
We are seeking a high-caliber Senior Azure Data Engineer to join a globally recognized US-based Consulting and Advisory firm. In this role, you will act as a "Cloud Data Architect," designing and implementing enterprise-scale data solutions within the Microsoft Fabric and Azure ecosystems.
You will leverage Python/PySpark and SQL to build robust ELT/ETL pipelines, specifically utilizing OneLake, Lakehouses, and Synapse Pipelines. The ideal candidate possesses the technical depth to manage Databricks/Delta Lake environments and the "DevOps Maturity" to automate infrastructure using Terraform or Bicep. You will be responsible for ensuring data governance, lineage, and cost-optimized performance across cutting-edge cloud architectures.
Responsibilities :
- Fabric & Lakehouse Architecture : Lead the implementation of Microsoft Fabric components, including OneLake, Lakehouses, and Warehouses, utilizing Direct Lake mode for high-performance Power BI reporting.
- Advanced Pipeline Engineering : Design and deploy complex ETL/ELT workflows using Azure Data Factory (ADF), Synapse Pipelines, and Fabric Data Pipelines.
- Data Wrangling & Processing : Develop high-performance data transformation logic using Python (PySpark) and SQL, specifically optimized for large-scale data wrangling and notebook development.
- Delta Lake & Databricks : Implement and maintain Lakehouse architectures using Delta Lake and Databricks, ensuring ACID compliance and efficient data versioning (see the illustrative upsert sketch after this list).
- Performance Tuning : Perform deep-dive SQL optimization and PySpark execution plan analysis to enhance data processing speed and reduce compute costs.
- Infrastructure as Code (IaC) : Automate the provisioning of Azure services using Terraform or Bicep, following modular and reusable code practices.
- DevOps & CI/CD Governance : Build and manage YAML-based pipelines for automated testing, integration, and continuous deployment of data artifacts.
- Data Governance & Lineage : Implement frameworks for data quality, unit testing, and end-to-end lineage to ensure transparency and compliance across the advisory firm's data assets.
- Azure Foundational Security : Manage Azure storage (ADLS Gen2), networking (Private Links), and security (Key Vault, RBAC) while adhering to strict cost-management protocols.
- Technical Mentorship : Guide junior engineers in best practices for notebook development, code modularization, and cloud-native security.
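To give a concrete sense of the Delta Lake work described above, below is a minimal PySpark sketch of an idempotent upsert into a Delta table. It assumes a Spark session with Delta Lake support, as provided in Fabric or Databricks notebooks; the table, path, and column names are illustrative only.

```python
# Minimal sketch: idempotent upsert into a Delta table from a PySpark notebook.
# Assumes Delta Lake support is available (Fabric/Databricks notebooks provide it);
# the landing path, table name, and columns are illustrative assumptions.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Incremental batch landed by an upstream ADF / Fabric pipeline (hypothetical path).
updates = (
    spark.read.format("parquet")
    .load("Files/landing/customers/2026-01-28/")
    .withColumn("ingested_at", F.current_timestamp())
)

target = DeltaTable.forName(spark, "silver.customers")

# MERGE gives ACID upserts; the Delta transaction log keeps version history,
# so a load can be audited or rolled back via time travel.
(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```

Because the MERGE is keyed on a business identifier, re-running the same batch does not create duplicates, which is the property the pipeline-engineering responsibilities above depend on.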
Technical Requirements :
- Core Cloud Skills : 7-10 years of experience in data engineering, with at least 5 years focused on the Microsoft Azure stack.
- Programming Mastery : Expert-level proficiency in Python/PySpark and Advanced SQL (Performance Tuning, CTEs, Window Functions); a brief example follows this list.
- Fabric Ecosystem : Proven project experience with Microsoft Fabric (OneLake, Lakehouse, Power BI Direct Lake).
- ETL Toolkit : Hands-on expertise in Azure Data Factory (ADF) and Synapse Analytics.
- Automation & DevOps : Strong experience with Terraform/Bicep and Azure DevOps (YAML pipelines).
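As an indication of the SQL depth expected, the snippet below combines a CTE with a window function via Spark SQL to keep only the latest record per key; the table and column names are hypothetical.

```python
# Illustrative only: a CTE plus a window function to deduplicate to the
# latest record per key. Table and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

latest_orders = spark.sql("""
    WITH ranked AS (
        SELECT
            order_id,
            customer_id,
            order_total,
            updated_at,
            ROW_NUMBER() OVER (
                PARTITION BY order_id
                ORDER BY updated_at DESC
            ) AS rn
        FROM bronze.orders
    )
    SELECT order_id, customer_id, order_total, updated_at
    FROM ranked
    WHERE rn = 1
""")

latest_orders.show(5)
```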
Preferred Skills :
- Delta Lake Expertise : Strong familiarity with Databricks environments and the Medallion (Bronze/Silver/Gold) architecture (a Bronze-to-Silver sketch follows this list).
- Consulting Experience : Prior experience in a US-based consulting environment, managing global stakeholder requirements.
- Data Visualization : Basic understanding of Power BI to optimize data models for end-user consumption.
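The following is a minimal sketch of a Bronze-to-Silver promotion in the Medallion pattern, assuming Delta tables registered as bronze.events and silver.events; the schema and cleansing rules are illustrative, not a prescribed implementation.

```python
# Sketch of a Bronze -> Silver step in a Medallion layout: read raw Delta data,
# apply basic cleansing and deduplication, then write the curated Silver table.
# Table names, columns, and cleansing rules are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

bronze = spark.read.table("bronze.events")

silver = (
    bronze
    .filter(F.col("event_id").isNotNull())                   # drop malformed rows
    .withColumn("event_date", F.to_date("event_timestamp"))  # standardise types
    .dropDuplicates(["event_id"])                             # one row per event
)

(
    silver.write.format("delta")
    .mode("overwrite")
    .option("overwriteSchema", "true")
    .saveAsTable("silver.events")
)
```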
Core Competencies :
- Architectural Rigor : Ability to build "future-proof" data platforms that balance scalability with cost-efficiency.
- Analytical Problem-Solving : A methodical approach to troubleshooting complex pipeline failures or data inconsistencies.
- Strategic Communication : Strong verbal and written skills to present technical roadmaps to global advisory leads.
- Quality Focus : An uncompromising commitment to data accuracy, unit testing, and robust documentation.
Posted in : Data Engineering
Functional Area : Data Engineering
Job Code : 1606599