hirist

Job Description

Description :

We are seeking a highly skilled Data Engineering professional with strong expertise across SQL, Data Warehousing, ETL, BI, DevOps, and CI/CD, and proven experience in core banking system implementations and regulatory reporting data pipelines. The ideal candidate will be hands-on with modern and traditional ETL tools, messaging platforms, big-data processing frameworks, and reporting tools. Experience working in banks or financial institutions is essential.

The core responsibilities of this role include the following :

Data Engineering and ETL :

- Design, build, and optimize ETL workflows for data extraction from Core Banking Systems (CBS) for regulatory and analytics use cases.

- Develop ingestion pipelines using Apache NiFi, Airflow, SSIS, Informatica, Python, PySpark, and Databricks.

- Integrate streaming/messaging frameworks using Kafka and RabbitMQ.

- Ensure high availability, performance, and data reliability across all pipelines.

Data Warehousing and Modelling :

- Build and manage enterprise data warehouse layers (staging, integration, semantic).

- Perform dimensional modelling, fact-dimension design, and data vault modelling where applicable.

- Optimize schema designs for performance and scalability across SQL and big-data environments.

SQL and Database Development :

- Write highly optimized SQL queries and stored procedures for data transformation and validation.

- Work extensively with SQL Server, MySQL, and Oracle.

- Conduct performance tuning, indexing strategies, and query plan optimization.

Big Data and Distributed Processing :

- Work with Databricks, PySpark, and distributed data processing frameworks.

- Build scalable data transformation pipelines for large datasets.

BI, Reporting and Visualisation :

- Support the BI layer by building clean, reliable datasets for regulatory and management reporting.

- Develop dashboards and visualizations using Power BI.

- Work with reporting teams to ensure data accuracy and alignment with end-user requirements.

DevOps and CI/CD :

- Implement CI/CD pipelines using tools like Azure DevOps, Git, Jenkins, or equivalent.

- Manage automated deployments, version control, testing, and environment management.

- Ensure secure and efficient deployment of ETL and data engineering components.

Banking Domain Expertise :

- Work closely with finance, risk, compliance, and operations teams.

- Develop and support data pipelines for regulatory reporting (CBUAE, SAMA, QCB, Basel, IFRS, ALM, liquidity, credit risk, etc.).

Requirements :

- Experience with core banking systems (T24, Finacle, Flexcube, etc.) is mandatory.

- 4-7 years of professional experience in data engineering, ETL, BI, DevOps, and SQL development.

- Mandatory experience working with banks, core banking systems, and regulatory reporting frameworks.

Required Skills :

- Databases : SQL Server, Oracle, MySQL.

- ETL Tools : Apache NiFi, Airflow, SSIS, Informatica.

- Big Data and Engines : Databricks, PySpark.

- Messaging : Kafka, RabbitMQ.

- Programming : Python (must), Java (preferred).

- Reporting : Power BI.

- Data Warehousing : Star schema, snowflake schema, dimensional modelling.

- NoSQL/Graph DBs : Nice to have (MongoDB, Neo4j, etc.).

- Banking Domain : Core banking system data, regulatory reporting.