hirist

Job Description

Role Overview:

We are looking for an experienced Azure Data Engineer with strong skills in big data, data modeling, and high-quality data pipelines. The role focuses on building scalable ETL/ELT pipelines, supporting real-time and batch cyber use cases, and enabling analytics and threat intelligence through well-governed data products.

Key Responsibilities:

- Design and implement ETL/ELT pipelines to move data across raw, trusted, and curated layers within the data lake.

- Support real-time and batch processing use cases related to cybersecurity analytics.

- Design and implement logical and physical data models to support cyber data products, including threat detection, incident response, and reporting.

- Develop high-quality, scalable code for data pipelines using Python and SQL.

- Produce and maintain comprehensive documentation for data pipelines, data models, and data quality metrics.

- Contribute to improving code standards, performance, and reliability across the data platform.

- Collaborate closely with data scientists, analysts, and business stakeholders.

- Actively participate in data quality monitoring, validation, and continuous improvement initiatives.

- Communicate technical concepts effectively to both technical and non-technical stakeholders.

Required Skills & Qualifications:

- 5-8 years of experience in data engineering or big data roles.

- Strong proficiency in SQL and Python.

- Hands-on experience with ETL/ELT frameworks and orchestration tools such as Apache Airflow and dbt.

- Experience working with Azure-based data platforms, especially Databricks on Azure.

- Strong understanding of data lake and data warehouse architectures.

- Working knowledge of version control systems (Git).

- Experience using BI tools, preferably Tableau.

- Strong focus on data quality, performance, and scalability.

Good to Have:

- Knowledge of cybersecurity principles, data sources, and security analytics use cases.

- Experience with streaming platforms such as Kafka or Azure Event Hubs.

- Familiarity with CI/CD for data pipelines.

- Exposure to Azure services such as ADLS Gen2, Synapse Analytics, and Data Factory.
