hirist

M&G Global Services - Senior Engineer - Investment Data Platform

M&G Global Services Private Limited
Multiple Locations
6 - 8 Years

Posted on: 27/08/2025

Job Description

Title : Senior Engineer - Investment Data Platform

Location : Pune/ Mumbai

Job Description :

Key accountabilities and responsibilities :

- Assisting with and contributing to the delivery of software that achieves business outcomes and is designed, implemented, and tested to the team's standards, following the SDLC for the Data Platform.

- Providing technical assistance in enhancing existing data platform artefacts and infrastructure, and carrying out continuous improvement.

- Providing a quality service and product to customers and stakeholders, further developing skills built through significant practical experience or training.

- Understanding, adapting to, and following data engineering best practices.

- Demonstrating a strong sense of ownership and commitment to building efficient, scalable, extensible, and robust software and data platforms.

- Working within established frameworks and procedures, with the freedom to interpret them to solve a range of problems.

- Delivering outcomes that are clearly defined, using discretion over how to achieve them.

- Contributing to self and team improvement discussions.

- Participating in the development of solutions for the end consumer as part of a high-performing team whose work spans requirements gathering, data modelling, data integration, software engineering, testing, and release oversight.

- Building and maintaining strong relationships with key stakeholders across the business and other teams across Asset Management Technology & Change.

Knowledge & Skills (Key) :

- Demonstrable software engineering skills, with experience in SQL, C#, and the .NET Framework.

- Knowledge of software patterns and principles.

- Working with some of the following programming and scripting languages: SQL, Python, Java, PowerShell, JavaScript.

- Working with data on the Azure stack, including SQL and NoSQL databases and Azure Data Factory pipelines.

- A good understanding of DevOps principles and experience in building CI/CD pipelines.

- Ability to proactively manage own delivery across parallel initiatives.

- Data exploration and analytical skills that enable you to solve new problems and understand existing software through investigation.

- Experience in handling and analysing large data sets.

- Good interpersonal skills, with the ability to communicate clearly and effectively, both written and orally, within a project team.

Knowledge & Skills (Desirable) :

- Exposure to financial markets and asset management processes, with an understanding of analysis across a wide variety of asset classes and associated analytics (e.g. Equity, Fixed Income, Private Assets, etc.).

- Experience in using Databricks for data ingestion or transformation.

Experience :

- 6+ years of total experience in software engineering.

- 3+ years of recent experience in a data engineering role.

- Experience of contributing to the delivery of software in a team that utilises Azure PaaS technologies, security and delivery via Azure DevOps.

- Experience of API gateways such as Apigee or Azure API Management.

- Experience of delivering change using Agile methodologies.

- Foundational understanding of the components of Investment Data (Transactions, Positions, Instrument T&Cs, etc.).
