Posted on: 13/08/2025
About Us :
LSEG (London Stock Exchange Group) is more than a diversified global financial markets infrastructure and data business.
We are dedicated, open-access partners with a commitment to excellence in delivering the services our customers expect from us.
With extensive experience, deep knowledge and worldwide presence across financial markets, we enable businesses and economies around the world to fund innovation, manage risk and create jobs.
It's how we've contributed to supporting the financial stability and growth of communities and economies globally for more than 300 years.
Through a comprehensive suite of trusted financial market infrastructure services and our open-access model we provide the flexibility, stability and trust that enable our customers to pursue their ambitions with confidence and clarity.
LSEG is headquartered in the United Kingdom, with significant operations in 70 countries across EMEA, North America, Latin America and Asia Pacific.
We employ 25,000 people globally, more than half located in Asia Pacific.
LSEG's ticker symbol is LSEG.
Role Description :
As a Principal Data Engineer, you'll design and implement functionalities, focusing on Data Engineering tasks.
You'll be working with semi-structured data to ingest and distribute it on a Microsoft Fabric-based platform, modernizing data products and distribution channels.
You'll drive the software development lifecycle for continuous data delivery and lead the evaluation and adoption of emerging technologies.
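To give a concrete sense of the work, here is a minimal sketch of one such ingestion task: semi-structured JSON landed as a Delta table with PySpark. The paths, table name, and columns are hypothetical assumptions, and in a Microsoft Fabric notebook the Spark session would already be provisioned.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, current_timestamp

# Hypothetical locations for illustration only.
LANDING_PATH = "abfss://landing@example.dfs.core.windows.net/events/"
TARGET_TABLE = "lakehouse.events_bronze"

spark = SparkSession.builder.appName("ingest-events").getOrCreate()

# Read semi-structured JSON; Spark infers a schema from the files.
raw = spark.read.json(LANDING_PATH)

# Light-touch normalization: lower-case column names, stamp the load time.
bronze = (
    raw.select([col(c).alias(c.lower()) for c in raw.columns])
       .withColumn("_ingested_at", current_timestamp())
)

# Land the result as a Delta table for downstream data products.
bronze.write.format("delta").mode("append").saveAsTable(TARGET_TABLE)
```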
Key Responsibilities :
- Provide partnership and support to SMEs and Tech Leads to ensure delivery on commitments
- Build and maintain secure and compliant production data processing pipelines on Microsoft Fabric and Azure to ingest, land, and transform data into data products (a streaming sketch follows after this list)
- Ensure that data pipelines and data stores are high-performing, efficient, organized, and reliable, given a set of business requirements and constraints.
- Design, implement, monitor, and optimize data platforms to meet the data pipelines' functional and non-functional requirements.
- Responsible for data-related implementation tasks that include provisioning data storage services, ingesting streaming and batch data, transforming data, implementing security requirements, implementing data retention policies, identifying performance bottlenecks, implementing required monitoring and telemetry, and accessing external data sources.
- Design and operationalize large-scale enterprise data solutions and applications using one or more Azure data and analytics services.
- Implement data solutions that use Azure services such as Delta.io, Lakehouse, Fabric, Azure Cosmos DB, Azure Data Factory, Spark, Azure Blob Storage, and Microsoft Purview.
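The streaming sketch referenced above: one plausible shape for the streaming leg of such a pipeline is a Spark Structured Streaming job that picks files up from a landing zone and appends them to a Delta table, with a checkpoint for restartable, exactly-once writes. All paths, names, and columns here are illustrative assumptions rather than a prescribed design.

```python
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

# Illustrative locations only; a real pipeline would read these from config.
SOURCE_PATH = "abfss://landing@example.dfs.core.windows.net/trades/"
TARGET_PATH = "Tables/trades_bronze"
CHECKPOINT = "Files/_checkpoints/trades_bronze"

spark = SparkSession.builder.appName("stream-trades").getOrCreate()

# Streaming file sources require an explicit schema.
schema = StructType([
    StructField("trade_id", StringType()),
    StructField("symbol", StringType()),
    StructField("executed_at", TimestampType()),
])

stream = spark.readStream.schema(schema).json(SOURCE_PATH)

# The checkpoint makes the Delta write restartable with exactly-once semantics.
query = (
    stream.writeStream.format("delta")
          .option("checkpointLocation", CHECKPOINT)
          .outputMode("append")
          .start(TARGET_PATH)
)
query.awaitTermination()
```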
Skills and experience :
- Relevant experience (depending on seniority) with Data Platforms, ideally in the Financial Services industry, and with Azure's PaaS/SaaS offerings (Fabric, Synapse, Purview, ADF, Azure Data Lake Storage, etc.)
- Demonstrable experience in a similar role, with a focus on cloud-based distributed data processing platforms for Spark and modern open table formats such as Delta/Iceberg.
- Solid experience with Azure: Synapse Analytics, Data Factory, Data Lake, Databricks, Microsoft Purview, Monitor, SQL Database, SQL Managed Instance, Stream Analytics, Cosmos DB, Storage Services, ADLS, Azure Functions, Log Analytics, Serverless Architecture, ARM Templates.
- Strong proficiency in Spark, SQL, and Python/Scala/Java.
- Experience in building Lakehouse architecture using open-source formats like Delta and Parquet, and tools like Jupyter notebooks.
- Strong grasp of security best practices (e.g., Azure Key Vault, IAM, RBAC, Azure Monitor; see the sketch after this list).
- Proficient in integrating, transforming, and consolidating data from various structured and unstructured data systems into a structure that is suitable for building analytics solutions.
- Ability to understand data through exploration, with experience in processes related to data retention, validation, visualization, preparation, matching, fragmentation, segmentation, and enhancement.
- Ability to think strategically while operating in the day-to-day delivery mode the role requires.
- Demonstrated ability to understand business requirements and their implications for current and future roadmaps.
- High-level understanding of Azure DevOps
- Agile development processes (Scrum and Kanban)
- Strong communication, presentation, documentation, and interpersonal skills
- Able to self-manage and work independently in a fast-paced environment with dynamic requirements and priorities
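On the security point above, a common pattern is to keep credentials out of code entirely and resolve them at runtime from Azure Key Vault via a managed identity. A minimal sketch, assuming the azure-identity and azure-keyvault-secrets packages and a hypothetical vault and secret name:

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Hypothetical vault. DefaultAzureCredential resolves a managed identity
# in Azure (or a developer login locally), so no secret lives in the code.
VAULT_URL = "https://example-vault.vault.azure.net"

credential = DefaultAzureCredential()
client = SecretClient(vault_url=VAULT_URL, credential=credential)

# Fetch a connection string at runtime instead of hard-coding it.
storage_conn = client.get_secret("storage-connection-string").value
```

With RBAC applied to the vault itself, access can then be granted per pipeline identity rather than through shared keys.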
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1528803