hirist

Job Description

About the Role

We are seeking a skilled and results-driven Data Engineer who can combine technical expertise with strong business understanding. This role demands hands-on experience with modern Azure data stack tools, agile delivery practices, and the ability to solve real-world data challenges at scale. You will play a key role in building and optimizing scalable data pipelines, data lakes, and data transformation workflows that power business insights and AI-driven solutions.

Key Responsibilities:

- Design, develop, and maintain scalable data pipelines using Azure Data Factory and Azure Databricks.

- Manage and optimize storage solutions using Azure Data Lake Storage (Gen2) and Delta Lake.

- Write efficient and reusable code in SQL for data extraction, transformation, and loading (ETL/ELT).

- Integrate and automate workflows across data ingestion, cleansing, validation, and transformation processes.

- Work closely with data analysts, architects, and business teams to gather data requirements and deliver solutions.

- Ensure data governance, quality, and security standards are met across data platforms.

- Participate in code reviews, performance tuning, and continuous improvement of data engineering practices.

- Stay updated with emerging trends in cloud data engineering, analytics, and AI/ML integrations.
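The ingestion, cleansing, validation, and transformation stages above can be sketched in plain Python. This is an illustrative sketch only, not the team's actual pipeline: in this role these stages would typically be orchestrated with Azure Data Factory and run on Azure Databricks, and every function and field name here is hypothetical.

```python
def ingest(rows):
    """Pull raw records from a source (stubbed as an in-memory list)."""
    return list(rows)

def cleanse(rows):
    """Drop records missing required fields and normalise strings."""
    return [
        {**r, "name": r["name"].strip().title()}
        for r in rows
        if r.get("name") and r.get("amount") is not None
    ]

def validate(rows):
    """Keep only records that pass a basic business rule (non-negative amount)."""
    return [r for r in rows if r["amount"] >= 0]

def transform(rows):
    """Aggregate validated records into a simple per-name total."""
    totals = {}
    for r in rows:
        totals[r["name"]] = totals.get(r["name"], 0) + r["amount"]
    return totals

raw = ingest([
    {"name": " alice ", "amount": 10},
    {"name": "bob", "amount": -5},   # fails validation
    {"name": None, "amount": 3},     # fails cleansing
    {"name": "Alice", "amount": 2},
])
print(transform(validate(cleanse(raw))))  # {'Alice': 12}
```

Chaining the stages as plain functions keeps each step independently testable, which is the same property an orchestrated pipeline aims for.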

Required Skills & Experience:

- 3-6 years of hands-on experience as a Data Engineer or in a similar role.

- Proven expertise in the Azure data ecosystem, including:

- Azure Data Factory (ADF)

- Azure Databricks

- Azure Data Lake Storage (ADLS Gen2)

- Delta Lake

- Strong command of SQL for querying, transformation, and performance optimization.

- Basic scripting skills in Python for data manipulation and automation tasks.

- Understanding of distributed data processing concepts and performance tuning.
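As a rough illustration of the SQL transformation skills listed above, here is a minimal ELT-style step using Python's stdlib `sqlite3`. In practice this SQL would run against a Databricks SQL or Azure SQL endpoint; the table and column names are invented for the example.

```python
import sqlite3

# Assumed, simplified schema: a raw landing table transformed into a
# cleaned summary table entirely in SQL.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, region TEXT, amount REAL);
    INSERT INTO raw_orders VALUES
        (1, 'east', 100.0),
        (2, 'east', 50.0),
        (3, 'west', 75.0),
        (4, NULL,   10.0);   -- dirty row: unknown region

    -- Transform: filter out dirty rows and aggregate into a summary table.
    CREATE TABLE region_sales AS
    SELECT region, SUM(amount) AS total_amount, COUNT(*) AS order_count
    FROM raw_orders
    WHERE region IS NOT NULL
    GROUP BY region;
""")
for row in conn.execute("SELECT * FROM region_sales ORDER BY region"):
    print(row)
# ('east', 150.0, 2)
# ('west', 75.0, 1)
```

Pushing the filtering and aggregation into SQL, rather than application code, is the usual pattern when the warehouse engine can do the heavy lifting.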

Good to Have:

- Experience with Apache Spark for big data processing.

- Familiarity with Microsoft Fabric (MS Fabric) for integrated data workflows.

- Understanding of Data Warehouse concepts, including dimensional modeling and star/snowflake schemas.

- Exposure to Generative AI (GenAI) or AI-based data applications.

- Experience in data visualization and dashboard tools is a plus.
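The dimensional-modeling concept mentioned above can be shown with a toy star schema: one fact table joined to a dimension and aggregated by a dimension attribute. This is a hedged, self-contained sketch using stdlib `sqlite3`; all table and column names are hypothetical.

```python
import sqlite3

# Minimal star schema: fact_sales (measures) + dim_product (attributes).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (product_key INTEGER, quantity INTEGER,
                             revenue REAL);

    INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
    INSERT INTO fact_sales VALUES (1, 2, 20.0), (1, 1, 10.0), (2, 3, 90.0);
""")

# Typical star-schema query: aggregate fact measures by a dimension attribute.
query = """
    SELECT d.category, SUM(f.revenue) AS total_revenue
    FROM fact_sales f
    JOIN dim_product d USING (product_key)
    GROUP BY d.category
    ORDER BY d.category
"""
for row in conn.execute(query):
    print(row)
# ('books', 30.0)
# ('games', 90.0)
```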

