Job Description

What You'll Do:

- Design, develop, and maintain scalable, reliable, and high-performance data pipelines using Azure Data Services.

- Build and optimise ETL/ELT pipelines using Azure Databricks (PySpark).

- Ingest, transform, and process structured and unstructured data from multiple data sources.

- Implement data models to support analytics, reporting, and machine learning use cases.

- Ensure data quality, data validation, and governance standards are met.

- Optimise performance and cost for large-scale data processing workloads.

- Integrate data pipelines with downstream systems such as Power BI, data science models, and enterprise applications.

- Collaborate with cross-functional teams, including Data Science, Analytics, DevOps, and Product.

- Implement CI/CD pipelines for data workloads and follow DevOps best practices.

- Support production deployments and troubleshoot data pipeline issues.

What You Know:

Cloud & Data Platforms:

- 10+ years of strong experience with Microsoft Azure, including:

- Azure Data Lake Storage (ADLS Gen2).

- Azure Data Factory (ADF).

- Azure Synapse Analytics.

- Hands-on experience with Azure Databricks.

Programming & Frameworks:

- Strong proficiency in Python for data engineering.

- Hands-on experience with PySpark / Spark SQL.

- Experience working with notebooks, jobs, and workflows in Databricks.

Databases & Storage:

- Experience with relational databases (Azure SQL, SQL Server, PostgreSQL).

- Experience with NoSQL / big data stores (Delta Lake, Cosmos DB preferred).

- Strong SQL skills for data transformation and optimisation.

Data Engineering Concepts:

- Strong understanding of data warehousing, lakehouse architecture, and distributed systems.

- Experience implementing Delta Lake, partitioning, indexing, and performance tuning.

- Knowledge of batch and near-real-time data processing.

DevOps & Engineering Practices:

- Experience with CI/CD pipelines (Azure DevOps preferred).

- Familiarity with version control (Git).

- Exposure to infrastructure as code (ARM / Terraform good to have).

- Understanding of monitoring, logging, and alerting for data platforms.

Good to Have:

- Experience with streaming platforms (Kafka, Azure Event Hubs, or Spark Structured Streaming).

- Exposure to data governance, security, and compliance frameworks.

- Experience supporting ML pipelines and feature engineering.

- Power BI or other BI tool integration experience.

Education:

- Bachelor's degree in Computer Science, Information Systems, Engineering, Computer Applications, or a related field.

Benefits:

In addition to competitive salaries and benefits packages, Nisum India offers its employees some unique and fun extras:

- Continuous Learning - Year-round training sessions are offered, along with skill-enhancement certifications sponsored by the company on an as-needed basis. We support our team to excel in their field.

- Parental Medical Insurance - Nisum believes our team is the heart of our business, and we want to make sure to take care of the heart of theirs. We offer opt-in parental medical insurance in addition to our medical benefits.

- Activities - From the Nisum Premier League's cricket tournaments to hosting a Hack-a-thon, Nisum employees can participate in a variety of team-building activities, such as skits and dance performances, in addition to festival celebrations.

- Free Meals - Free snacks and dinner are provided daily, in addition to a subsidised lunch.

