
Klüber Lubrication - Data Engineer - ETL/Data Warehousing

Klüber Lubrication
Bangalore
3 - 6 Years

Posted on: 24/11/2025

Job Description

Role Overview :

The Data Engineer position requires a candidate with 3+ years of experience in Data Management or Data Engineering, focused on building robust and scalable data platforms.

Based in Bangalore with a hybrid work model at Klüber Lubrication India Pvt. Ltd., this role is central to leveraging modern Microsoft cloud technologies to strengthen the company's data infrastructure.

The incumbent must possess expertise in T-SQL and practical experience with the Microsoft Fabric ecosystem for building end-to-end data solutions.

Job Summary :

We are seeking an experienced Data Engineer (3+ years) with a strong educational background in IT or Computer Science to specialize in the Microsoft Azure and Fabric stack. The ideal candidate will have expertise in T-SQL for complex query development and optimization, and hands-on experience developing and operating data pipelines using Microsoft Fabric components (OneLake, Synapse, Data Factory). Key responsibilities include deploying ETL/ELT solutions, using Python/PowerShell for automation, applying DevOps methodologies, and collaborating across departments to ensure high-quality, reliable data delivery.

Key Responsibilities and Technical Deliverables :

Data Pipeline Development and Operations :

- Demonstrate a proven track record of developing and deploying ETL/ELT pipelines, Data Warehouse solutions, and other modern data platforms.

- Apply practical experience with Microsoft Fabric (OneLake, Synapse, Data Factory) to build scalable and reliable data ingestion, transformation, and integration workflows (a minimal sketch follows below).
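
For illustration only, here is a minimal sketch of what one such ingestion step might look like in a Fabric notebook using PySpark. The file path (Files/raw/orders.csv) and table name (stg_orders) are hypothetical and not part of this job description.

```python
# Purely illustrative Fabric-notebook-style ingestion step using PySpark.
# The source path and target table name are placeholders; in a Fabric notebook
# the `spark` session already exists and "Files/..." resolves to the attached lakehouse.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Ingest a raw CSV landed in the lakehouse (hypothetical path).
raw = (
    spark.read.option("header", True)
    .option("inferSchema", True)
    .csv("Files/raw/orders.csv")
)

# Light transformation: normalize column names and stamp the load time.
clean = (
    raw.toDF(*[c.strip().lower().replace(" ", "_") for c in raw.columns])
       .withColumn("load_ts", F.current_timestamp())
)

# Persist as a Delta table that downstream Warehouse queries and semantic models can consume.
clean.write.format("delta").mode("overwrite").saveAsTable("stg_orders")
```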

Database Querying and Optimization :

- Utilize expertise in T-SQL to develop complex stored procedures, views, and functions, optimize existing queries, and troubleshoot performance bottlenecks within relational databases (see the illustrative snippet below).
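
As a purely illustrative sketch of how such T-SQL work is typically exercised from tooling, the snippet below calls a hypothetical stored procedure (dbo.usp_orders_by_customer) through pyodbc with parameter markers. The connection details, procedure name, and parameters are assumptions, not requirements from this posting.

```python
# Purely illustrative: calling a hypothetical T-SQL stored procedure from Python
# with pyodbc. Server, database, procedure, and parameters are assumptions.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<your-server>.database.windows.net;"  # placeholder
    "DATABASE=<your-database>;"                   # placeholder
)

def orders_by_customer(customer_id: int, since: str):
    """Execute a parameterized stored procedure and return its result rows."""
    with pyodbc.connect(CONN_STR) as conn:
        cur = conn.cursor()
        # Parameter markers keep execution plans reusable and avoid SQL injection.
        cur.execute(
            "EXEC dbo.usp_orders_by_customer @CustomerId = ?, @Since = ?;",
            customer_id,
            since,
        )
        return cur.fetchall()

if __name__ == "__main__":
    for row in orders_by_customer(42, "2025-01-01"):
        print(row)
```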

Automation and DevOps :

- Leverage basic scripting skills in Python and PowerShell to automate deployment processes, monitoring tasks, and data management operations (see the sketch after this list).

- Apply knowledge of DevOps methodologies and monitoring to integrate CI/CD practices into the data platform lifecycle and maintain high operational reliability.
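
The sketch below illustrates the kind of lightweight monitoring automation referred to above: a Python freshness check that can run as a scheduled or CI/CD step and fail loudly when a table has not been loaded recently. The connection string, table name (dbo.stg_orders), and 24-hour threshold are hypothetical.

```python
# Hypothetical freshness check for a staging table; intended to run as a
# scheduled job or CI/CD step. Connection string, table name, and threshold
# are placeholders, not details from the job description.
import sys
from datetime import datetime, timedelta

import pyodbc

CONN_STR = "DRIVER={ODBC Driver 18 for SQL Server};SERVER=<server>;DATABASE=<db>;"  # placeholder
MAX_STALENESS = timedelta(hours=24)

def latest_load_ts(table: str):
    """Return the most recent load timestamp recorded in the given table."""
    with pyodbc.connect(CONN_STR) as conn:
        # Table name comes from our own config, not user input, so an f-string is acceptable here.
        row = conn.cursor().execute(f"SELECT MAX(load_ts) FROM {table};").fetchone()
        return row[0]

if __name__ == "__main__":
    loaded_at = latest_load_ts("dbo.stg_orders")
    # load_ts is assumed to be stored in UTC, so compare against UTC "now".
    if loaded_at is None or datetime.utcnow() - loaded_at > MAX_STALENESS:
        print(f"ALERT: dbo.stg_orders is stale (last load: {loaded_at})")
        sys.exit(1)  # non-zero exit fails the pipeline step and triggers alerting
    print(f"OK: dbo.stg_orders last loaded at {loaded_at}")
```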

Quality, Collaboration, and Problem Solving :

- Utilize strong analytical and problem-solving skills to diagnose data quality issues and resolve complex data integration challenges.

- Work with teams across departments (Business, Analytics, IT) to gather requirements and deliver data solutions that align with business needs.

- Maintain strong attention to detail and a commitment to quality in all documentation and code.

Mandatory Skills & Qualifications :

- Experience : 3+ years of experience in Data Management / Data Engineering.

- Education : Degree in Information Technology, Computer Science, or a related field.

- Database/SQL : Expertise in T-SQL for complex query development, optimization, and troubleshooting.

- Cloud Platform : Practical experience with Microsoft Fabric (OneLake, Synapse, Data Factory).

- ETL/Data Warehousing : Proven track record of developing and deploying ETL pipelines, Data Warehouse solutions, or other data platforms.

- Automation : Basic scripting skills in Python and PowerShell for automation.

- Methodology : Knowledge of DevOps methodologies and monitoring.

- Soft Skills : Strong analytical and problem-solving skills; ability to work independently and as part of a team.

Preferred Skills :

- Experience with Azure Data Services outside of Fabric (e.g., Azure Data Lake Storage Gen2, Azure Databricks).

- Knowledge of dimensional modeling (Kimball methodology).

- Prior experience in the manufacturing or industrial sector.

