
Optimum InfoSystem - Data Project Manager - Databricks/Confluent Kafka

Optimum InfoSystem Pvt. Ltd.
Multiple Locations
10 - 20 Years

Posted on: 11/08/2025

Job Description

Job Summary :


We are seeking a results-driven Data Project Manager (PM) to lead data initiatives leveraging Databricks and Confluent Kafka in a regulated banking environment. The ideal candidate will have a strong background in data platforms, project governance, and financial services, and will be responsible for ensuring successful end-to-end delivery of complex data transformation initiatives aligned with business and regulatory requirements.


Key Responsibilities :


- Lead planning, execution, and delivery of enterprise data projects using Databricks and Confluent.

- Develop detailed project plans, delivery roadmaps, and work breakdown structures.

- Ensure resource allocation, budgeting, and adherence to timelines and quality standards.

- Collaborate with data engineers, architects, business analysts, and platform teams to align on project goals.

- Act as the primary liaison between business units, technology teams, and vendors.

- Facilitate regular updates, steering committee meetings, and issue/risk escalations.

- Oversee solution delivery on Databricks (for data processing, ML pipelines, analytics).

- Manage real-time data streaming pipelines via Confluent Kafka.

- Ensure alignment with data governance, security, and regulatory frameworks (e.g., GDPR, CBUAE, BCBS 239).

- Ensure all regulatory reporting data flows are compliant with local and international financial standards.

- Manage controls and audit requirements in collaboration with Compliance and Risk teams.


Required Skills & Experience :


- 7+ years of experience in Project Management within the banking or financial services sector.

- Proven experience leading data platform projects (especially Databricks and Confluent Kafka).

- Strong understanding of data architecture, data pipelines, and streaming technologies.

- Experience managing cross-functional teams (onshore/offshore).

- Strong command of Agile/Scrum and Waterfall methodologies.

- Databricks (Delta Lake, MLflow, Spark)

- Confluent Kafka (Kafka Connect, ksqlDB, Schema Registry)

- Azure or AWS Cloud Platforms (preferably Azure)

- Integration tools (Informatica, Data Factory), CI/CD pipelines

- Oracle ERP Implementation experience

- PMP / Prince2 / Scrum Master certification

- Familiarity with regulatory frameworks : BCBS 239, GDPR, CBUAE regulations

- Strong understanding of data governance principles (e.g., DAMA-DMBOK)


Education :


- Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.


Key Success Metrics :


- On-time, on-budget delivery of data initiatives

- Uptime and SLA adherence of data pipelines

- User satisfaction and stakeholder feedback

- Compliance with regulatory milestones

