Posted on: 26/09/2025
Position : Data Engineer - Azure
Experience : 4 to 8 Years
Location : Bangalore
Job Type : Full-time
Job Summary :
We are seeking a highly skilled Data Engineer with 4 to 8 years of experience, specializing in the Microsoft Azure ecosystem. The ideal candidate will have a minimum of 3 years of core experience in MS Azure and Databricks. This role requires deep expertise in developing scalable, real-time data platforms and pipelines using a security-first approach, leveraging technologies like Python, PySpark, and Spark Structured Streaming within the Azure cloud environment.
Key Responsibilities :
- Azure & Databricks Development : Architect, design, and develop end-to-end cloud-based analytics solutions and data platforms (Data Lakes/Lakehouses), with a primary focus on MS Azure and Databricks Delta Lakehouse architecture.
- Real-Time Data Workflows : Design and develop robust real-time data workflows and pipelines using PySpark Structured Streaming to handle continuous data ingestion and processing (see the illustrative sketch after this list).
- ETL/ELT & Programming : Build and optimize efficient ETL/ELT processes, with expert-level proficiency in the mandatory programming skills : Python, PySpark, and SQL.
- Data Modeling : Apply data modeling expertise, particularly for RDBMS systems, with hands-on experience in PostgreSQL.
- DevOps & Security : Architect and design data pipelines using CI/CD methodology, applying security-first development principles throughout the entire data engineering lifecycle.
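For illustration only, the minimal sketch below shows the kind of Structured Streaming workflow this role involves: reading a continuous feed, parsing it, and appending it to a Delta table. The broker address, topic, schema, and storage paths are hypothetical placeholders, not details of this posting.

# Minimal illustrative sketch; all names and paths below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("example-streaming-ingest").getOrCreate()

# Hypothetical schema for the incoming JSON events.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_time", TimestampType()),
])

# Read a continuous stream from a hypothetical Kafka topic.
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
    .option("subscribe", "events")                      # placeholder topic
    .load()
)

# Parse the JSON payload and keep only the fields of interest.
events = (
    raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
       .select("e.*")
)

# Append the parsed stream to a Delta table, with checkpointing for fault tolerance.
query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/events")  # placeholder path
    .outputMode("append")
    .start("/mnt/delta/events")                               # placeholder path
)

query.awaitTermination()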
Qualifications :
- Experience : 4 to 8 years of professional experience in Data Engineering.
- Core Cloud Experience : Minimum of 3 years of core experience in MS Azure and Databricks.
- Mandatory Technical Stack : Expert proficiency in Databricks, Python, PySpark, Spark Streaming, and SQL.
- Architecture : Experienced in the architecture, design, and development of data platforms / data lakes, preferably using the Databricks Delta Lakehouse approach.
- Data Flow : Proven experience with the architecture and design of real-time data workflows using Structured Streaming.
- Modeling : Expertise in data modeling with RDBMS systems, specifically PostgreSQL.
- Process : Experience in designing data pipelines using CI/CD methodology.
- Soft Skills : Strong development and problem-solving skills, with a commitment to secure development practices.
Posted in : Data Engineering
Functional Area : Data Engineering
Job Code : 1552391