Posted on: 07/12/2025
Description:
We are seeking a highly skilled Data Engineering professional with strong expertise across SQL, Data Warehousing, ETL, BI, DevOps, and CI/CD, and proven experience in core banking system implementations and regulatory reporting data pipelines. The ideal candidate will be hands-on with modern and traditional ETL tools, messaging platforms, big-data processing frameworks, and reporting tools. Experience working in banks or financial institutions is essential.
The core responsibilities for the job include the following:
Data Engineering and ETL:
- Develop ingestion pipelines using Apache NiFi, Airflow, SSIS, Informatica, Python, PySpark, and Databricks.
- Integrate streaming/messaging frameworks using Kafka and RabbitMQ.
- Ensure high availability, performance, and data reliability across all pipelines.
Data Warehousing and Modelling:
- Perform dimensional modelling, fact-dimension design, and data vault modelling where applicable.
- Optimize schema designs for performance and scalability across SQL and big-data environments.
SQL and Database Development:
- Work extensively with SQL Server, MySQL, and Oracle.
- Perform performance tuning, design indexing strategies, and optimize query plans.
Big Data and Distributed Processing:
- Build scalable data transformation pipelines for large datasets.
BI, Reporting and Visualisation:
- Support the BI layer by building clean, reliable datasets for regulatory and management reporting.
- Develop dashboards and visualizations using Power BI.
- Work with reporting teams to ensure data accuracy and alignment with end-user requirements.
DevOps and CI/CD:
- Manage automated deployments, version control, testing, and environment management.
- Ensure secure and efficient deployment of ETL and data engineering components.
Banking Domain Expertise:
- Work closely with finance, risk, compliance, and operations teams.
- Develop and support data pipelines for regulatory reporting (CBUAE, SAMA, QCB, Basel, IFRS, ALM, liquidity, credit risk, etc.).
Requirements:
- 4-7 years of professional experience in data engineering, ETL, BI, DevOps, and SQL development.
- Mandatory experience working with banks, core banking systems, and regulatory reporting frameworks.
Required Skills:
- ETL Tools: Apache NiFi, Airflow, SSIS, Informatica.
- Big Data and Engines: Databricks, PySpark.
- Messaging: Kafka, RabbitMQ.
- Programming: Python (must), Java (preferred).
- Reporting: Power BI.
- Data Warehousing: Star schema, snowflake schema, dimensional modelling.
- NoSQL/Graph DBs: Nice to have (MongoDB, Neo4j, etc.).
- Banking Domain: Core banking system data, regulatory reporting.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1586218