Posted on: 23/11/2025
Description :
Role Summary :
We are seeking a highly skilled Data Engineering professional with strong expertise across SQL, Data Warehousing, ETL, BI, DevOps, and CI/CD, and proven experience in core banking system implementations and regulatory reporting data pipelines. The ideal candidate will be hands-on with modern and traditional ETL tools, messaging platforms, big-data processing frameworks, and reporting tools. Experience working in banks or financial institutions is essential.
Key Responsibilities :
1. Data Engineering & ETL :
- Design, build, and optimize ETL workflows for data extraction from Core Banking Systems (CBS) for regulatory and analytics use cases.
- Develop ingestion pipelines using Apache NiFi, Airflow, SSIS, Informatica, Python, PySpark, and Databricks (a minimal Airflow sketch follows this list).
- Integrate streaming/messaging frameworks using Kafka and RabbitMQ.
- Ensure high availability, performance, and data reliability across all pipelines.
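To illustrate the kind of ingestion work involved, here is a minimal sketch of a daily extraction DAG, assuming Apache Airflow 2.x; the DAG, task, connection, and table names (cbs_regulatory_extract, cbs_transactions, staging_db) are hypothetical:

```python
# Minimal daily CBS extraction DAG sketch, assuming Apache Airflow 2.x.
# All names below (DAG id, table, staging target) are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_cbs_transactions(**context):
    """Pull one day of transactions from the core banking system (stub)."""
    run_date = context["ds"]  # Airflow's logical date, e.g. "2025-11-23"
    # A real task would query the CBS replica and land the result in a
    # staging table; here we only log the intent.
    print(f"Extracting cbs_transactions for {run_date} into staging_db")


with DAG(
    dag_id="cbs_regulatory_extract",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # 'schedule_interval' on Airflow < 2.4
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_cbs_transactions",
        python_callable=extract_cbs_transactions,
    )
```

A production DAG would add retries, SLAs, alerting, and a real CBS connection in place of the stub.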
2. Data Warehousing & Modelling :
- Build and manage enterprise data warehouse layers (staging, integration, semantic).
- Perform dimensional modelling, fact/dimension design, and data vault modelling where applicable (a star-schema sketch follows this list).
- Optimize schema designs for performance and scalability across SQL and big-data environments.
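As an illustration of fact/dimension design, a minimal star-schema sketch, using the stdlib sqlite3 driver as a portable stand-in for SQL Server or Oracle; all table and column names are hypothetical:

```python
# Star-schema sketch: two dimensions and one fact table at daily grain.
# sqlite3 stands in for SQL Server/Oracle; names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE dim_account (
        account_key  INTEGER PRIMARY KEY,   -- surrogate key
        account_no   TEXT NOT NULL,         -- natural key from the CBS
        branch       TEXT,
        product_type TEXT
    );

    CREATE TABLE dim_date (
        date_key     INTEGER PRIMARY KEY,   -- e.g. 20251123
        calendar_dt  TEXT NOT NULL,
        month        INTEGER,
        year         INTEGER
    );

    -- Grain: one row per account per day.
    CREATE TABLE fact_daily_balance (
        account_key  INTEGER NOT NULL REFERENCES dim_account(account_key),
        date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
        balance_amt  REAL NOT NULL,
        PRIMARY KEY (account_key, date_key)
    );
    """
)
```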
3. SQL & Database Development :
- Write highly optimized SQL queries and stored procedures for data transformation and validation (a tuning sketch follows this list).
- Work extensively with SQL Server, MySQL, and Oracle.
- Conduct performance tuning, indexing strategies, and query plan optimization.
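A minimal sketch of index-driven tuning with a parameterized query, again using sqlite3 as a portable stand-in (SQL Server and Oracle expose their own plan tools, e.g. SHOWPLAN and EXPLAIN PLAN); the table, data, and index are hypothetical:

```python
# Index-driven query tuning sketch; sqlite3 stands in for SQL Server/Oracle.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txn (txn_id INTEGER, account_no TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO txn VALUES (?, ?, ?)",
    [(i, f"AC{i % 100:03d}", float(i)) for i in range(10_000)],
)

# Without an index, this predicate forces a full table scan.
conn.execute("CREATE INDEX ix_txn_account ON txn (account_no)")

# Parameterized query: safe and plan-cache friendly.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM txn WHERE account_no = ?",
    ("AC042",),
).fetchall()
print(plan)  # expect a SEARCH using ix_txn_account rather than a SCAN
```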
4. Big Data & Distributed Processing :
- Work with Databricks, PySpark, and distributed data processing frameworks.
- Build scalable data transformation pipelines for large datasets (a PySpark sketch follows below).
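A minimal PySpark sketch of a distributed rollup, assuming a local Spark session; the input path, schema, and output location are hypothetical:

```python
# Daily net movement per account: a typical rollup feeding the warehouse.
# Paths and column names (txn_ts, account_no, amount) are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("txn_rollup").getOrCreate()

txns = spark.read.parquet("/data/staging/cbs_transactions/")

daily = (
    txns
    .withColumn("txn_date", F.to_date("txn_ts"))
    .groupBy("account_no", "txn_date")
    .agg(F.sum("amount").alias("net_amount"))
)

daily.write.mode("overwrite").partitionBy("txn_date").parquet(
    "/data/integration/daily_movements/"
)
```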
5. BI, Reporting & Visualization :
- Support the BI layer by building clean, reliable datasets for regulatory and management reporting (a dataset-preparation sketch follows this list).
- Develop dashboards and visualizations using Power BI.
- Work with reporting teams to ensure data accuracy and alignment with end-user requirements.
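A minimal sketch of shaping a clean, reporting-ready dataset for a Power BI import, using pandas; the source file and column names are hypothetical:

```python
# Shape a monthly, branch-level dataset for a Power BI import.
# The CSV and its columns are hypothetical.
import pandas as pd

raw = pd.read_csv("daily_movements.csv", parse_dates=["txn_date"])

report = (
    raw
    .dropna(subset=["account_no", "net_amount"])  # no nulls in keys/measures
    .assign(month=lambda d: d["txn_date"].dt.to_period("M").astype(str))
    .groupby(["month", "branch"], as_index=False)["net_amount"]
    .sum()
    .rename(columns={"net_amount": "monthly_net_amount"})
)

report.to_csv("pbi_monthly_net.csv", index=False)  # picked up by Power BI
```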
6. DevOps & CI/CD :
- Implement CI/CD pipelines using tools like Azure DevOps, Git, Jenkins, or equivalent.
- Manage automated deployments, version control, testing, and environment management (a CI test sketch follows this list).
- Ensure secure and efficient deployment of ETL and data engineering components.
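A minimal sketch of the kind of automated data-quality check a CI pipeline (Azure DevOps, Jenkins, or similar) might run with pytest before promoting an ETL change; the dataset under test is hypothetical:

```python
# Data-quality tests run in CI (e.g. a `pytest` step in the pipeline).
# The artifact under test (pbi_monthly_net.csv) is hypothetical.
import pandas as pd


def load_report() -> pd.DataFrame:
    # In CI this would read the pipeline's test output artifact.
    return pd.read_csv("pbi_monthly_net.csv")


def test_no_duplicate_grain():
    df = load_report()
    assert not df.duplicated(subset=["month", "branch"]).any()


def test_amounts_are_numeric_and_complete():
    df = load_report()
    assert pd.api.types.is_numeric_dtype(df["monthly_net_amount"])
    assert df["monthly_net_amount"].notna().all()
```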
7. Banking Domain Expertise :
- Work closely with finance, risk, compliance, and operations teams.
- Experience with core banking systems (T24, Finacle, Flexcube, etc.) is mandatory.
- Develop and support data pipelines for regulatory reporting (CBUAE, SAMA, QCB, Basel, IFRS, ALM, liquidity, credit risk, etc.); a simplified sketch follows this list.
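A deliberately simplified sketch of a regulatory-style rollup in PySpark, bucketing balances by residual maturity as in an ALM/liquidity ladder. This is not the actual CBUAE, SAMA, or Basel calculation; the buckets, paths, and column names are hypothetical:

```python
# Simplified liquidity-ladder rollup (NOT a real regulatory formula).
# Buckets, paths, and columns (days_to_maturity, balance_amt) are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("liquidity_ladder").getOrCreate()

positions = spark.read.parquet("/data/integration/positions/")

ladder = (
    positions
    .withColumn(
        "maturity_bucket",
        F.when(F.col("days_to_maturity") <= 30, "0-30d")
         .when(F.col("days_to_maturity") <= 90, "31-90d")
         .otherwise("90d+"),
    )
    .groupBy("reporting_date", "maturity_bucket")
    .agg(F.sum("balance_amt").alias("total_balance"))
)

ladder.write.mode("overwrite").parquet("/data/reporting/liquidity_ladder/")
```

The actual report logic would come from the regulator's templates; the pipeline pattern (classify, aggregate, publish) is the transferable part.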
Required Skills :
- Databases : SQL Server, Oracle, MySQL
- ETL Tools : Apache NiFi, Airflow, SSIS, Informatica
- Big Data & Engines : Databricks, PySpark
- Messaging : Kafka, RabbitMQ
- Programming : Python (must), Java (preferred)
- Reporting : Power BI
- Data Warehousing : Star schema, snowflake schema, dimensional modelling
- DevOps/CI/CD : Git, Jenkins, Azure DevOps or similar
- NoSQL/Graph DBs : Nice to have (MongoDB, Neo4j, etc.)
- Banking Domain : Core banking system data, regulatory reporting
Experience :
- 4-7 years of professional experience in data engineering, ETL, BI, DevOps, and SQL development.
- Mandatory experience working with banks, core banking systems, and regulatory reporting frameworks.
Posted in : Data Engineering
Functional Area : Data Engineering
Job Code : 1579132