hirist

Job Description

Description:


- Integrating data from various sources into a unified Azure data warehouse and suitable data marts.

- Continuously monitoring and testing the availability, performance and quality of data pipelines.

- Collaborating with peers by following SDLC best practices while maintaining code repositories and activity using Agile, DevOps and CI/CD methodologies across Dev, Test and QA environments.

- Working closely with stakeholders to understand ongoing requirements, build effective products and align data modelling principles.

- Adhering to agreed Release and Change Management Processes.

- Troubleshooting and investigating anomalies and bugs in code or data.

- Adhering to test and reconciliation standards to produce confidence in delivery.

- Producing appropriate and comprehensive documentation to support ease of access for technical and non-technical users.

- Engaging in a culture of continual process improvement and best practice.

- Responding efficiently to changing business priorities through effective time management.

- Conducting all activities and duties in line with company policy and compliance requirements.

- Carrying out any other ad-hoc duties as requested by management.

EXPERIENCE:


- 6+ years of total experience in the IT industry as a developer/senior developer/data engineer.

- 4+ years of experience working extensively with Azure services such as Azure Data Factory, Azure Synapse and Azure Data Lake.

- 3+ years of experience working extensively with Azure SQL and MS SQL Server, including writing complex SQL queries.

- Good knowledge of important SQL concepts such as query optimization, data modelling and data governance.

- Working knowledge of CI/CD processes using Azure DevOps and Azure Logic Apps.

- Very good written and verbal communication skills.

Key skills required:


- Azure Data Factory, Azure Synapse, Azure SQL, Azure SQL Data Warehouse, MS SQL Server, Azure DevOps, Azure Logic Apps, Azure Data Lake.

- Solid experience with ADF metadata-driven integration runtimes, pipelines and data flows.

- Experience designing and implementing end-to-end ELT pipelines using Data Factory, Delta Live Tables and structured streaming.

- Experience and understanding of Azure fundamentals, networking and security.

- Experience working with traditional relational data warehouses, SQL Server or other database technologies.

