hirist

Senior Data Engineer - Delta Lake

Eqaim Technology & Services
Multiple Locations
7 - 10 Years

Posted on: 21/10/2025

Job Description



About the Project :

We are developing a data-driven platform designed to help companies optimize promotional activities for maximum business impact.

The solution collects and validates data, analyzes promotion effectiveness, plans promotional calendars, and integrates seamlessly with existing enterprise systems.

By leveraging machine learning, the platform enhances vendor collaboration, enables better negotiation, and empowers organizations to make informed, ROI-driven decisions.

Project Phase : Ongoing

Team : Large, cross-functional team with diverse technical and business roles.



Key Responsibilities :



- Design, build, and optimize scalable and reusable data pipelines using Databricks, Spark, and SQL

- Develop and orchestrate automated ETL workflows using Airflow, Azure Data Factory (ADF), and CI/CD pipelines

- Implement and manage Delta Lake components including Delta Live Tables, Unity Catalog, and Delta Sharing

- Ensure data quality, consistency, and compliance with governance and security standards

- Collaborate with stakeholders to translate business requirements into robust technical solutions

- Optimize data systems for performance, reliability, and cost-efficiency in cloud environments

- Apply software engineering best practices including testing, modularization, refactoring, and automation

- Evaluate and implement new tools to improve pipeline automation, monitoring, and efficiency

- Lead data engineering initiatives and mentor junior team members
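To illustrate the data-quality responsibility above, here is a minimal, hedged sketch of a row-level quality gate in plain Python (no Databricks dependency; all column names and check names are hypothetical). In a Databricks pipeline, the same idea would typically be expressed as Delta Live Tables expectations or DataFrame-level constraints rather than dict predicates:

```python
from dataclasses import dataclass

@dataclass
class QualityReport:
    total: int
    passed: int
    failed: int

def validate_rows(rows, checks):
    """Apply named predicate checks to each row; return survivors and a report."""
    good = []
    failed = 0
    for row in rows:
        if all(check(row) for check in checks.values()):
            good.append(row)
        else:
            failed += 1
    return good, QualityReport(total=len(rows), passed=len(good), failed=failed)

# Illustrative checks for a promotions dataset (field names are assumptions)
checks = {
    "non_null_sku": lambda r: r.get("sku") is not None,
    "valid_discount": lambda r: 0 < r.get("discount_pct", 0) <= 100,
}

rows = [
    {"sku": "A1", "discount_pct": 15},
    {"sku": None, "discount_pct": 20},  # fails non_null_sku
    {"sku": "B2", "discount_pct": 0},   # fails valid_discount
]

clean, report = validate_rows(rows, checks)
```

Keeping checks as small named predicates makes each rule independently unit-testable, which matches the testing and modularization practices listed above.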



Required Skills & Experience :


- Bachelor's or Master's degree in Computer Science, Engineering, or a related field

- 7+ years of hands-on experience in Data Engineering with strong Big Data expertise

- Deep expertise with Databricks, Spark, and SQL (including Spark SQL)

- Strong background in data modeling and ETL development for large-scale datasets

- Proven experience with Delta Live Tables, Unity Catalog, Delta Sharing, and SQL Warehouse (server & serverless)

- Proficiency in Python for data engineering workflows

- Experience orchestrating pipelines using Airflow and implementing CI/CD with tools like Azure DevOps or GitHub Actions

- Familiarity with analytics engineering and dbt

- Knowledge of data governance, compliance, and security in cloud-based environments

- Exposure to AI/ML frameworks for workflow optimization

- Excellent communication, problem-solving, and leadership abilities



Technology Stack :


- Core : Databricks, SQL, Spark, Python, Delta Lake (Delta Live Tables, Unity Catalog, Delta Sharing)

- Orchestration : Airflow, Azure Data Factory (ADF)

- CI/CD : Azure DevOps / GitHub Actions (or equivalent)

- Cloud Platform : Azure (preferred)

- Analytics & BI : dbt and related frameworks
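For context on the Delta Lake portion of this stack, the following is a hedged, in-memory sketch of MERGE (upsert) semantics, the core pattern behind incremental Delta pipelines. This is plain Python for illustration only; on Databricks the equivalent would be a `MERGE INTO` SQL statement or the `DeltaTable.merge()` API, and the `id`/`spend` fields are assumptions:

```python
def merge_upsert(target, updates, key="id"):
    """Upsert `updates` into `target` (lists of dicts), matching rows on `key`."""
    by_key = {row[key]: dict(row) for row in target}
    for row in updates:
        if row[key] in by_key:
            by_key[row[key]].update(row)  # matched -> update existing row
        else:
            by_key[row[key]] = dict(row)  # not matched -> insert new row
    return list(by_key.values())

# Example: one matched update (id=2) and one insert (id=3)
target = [{"id": 1, "spend": 100}, {"id": 2, "spend": 200}]
updates = [{"id": 2, "spend": 250}, {"id": 3, "spend": 50}]
merged = merge_upsert(target, updates)
```

Because the operation is keyed, re-running it with the same `updates` yields the same result, the idempotency property that makes incremental ETL loads safe to retry.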

