Posted on: 16/11/2025
Description:
In this role, you'll design and implement enterprise data models optimized for both BI and AI workloads. The position reports directly to our Technical Director, IT.
Our Info Architecture & BI Engineering team is dedicated to enabling data- and AI-driven decision making, and your contributions will be instrumental in preparing curated datasets and tools that deliver actionable insights and AI capabilities.
We offer a flexible work environment, embracing a hybrid approach for most office-based roles. Employees are encouraged to spend an average of at least three days per week onsite, allowing for two days of remote work.
Responsibilities:
Data Modeling & Architecture:
- Design and maintain scalable data models for BI and AI workloads.
- Implement semantic layers and lakehouse structures to support cross-domain analytics.
- Maintain data dictionaries and documentation for business-friendly data access.
Pipeline Development:
- Build and optimize ETL/ELT pipelines for structured, semi-structured, and unstructured data.
- Ensure pipelines support both analytics and machine learning feature extraction.
Data Quality & Governance:
- Apply data validation, lineage tracking, and cataloging standards.
- Maintain metadata and catalog for AI-ready datasets.
- Ensure compliance with data privacy and security regulations.
Performance Optimization:
- Tune queries and storage for BI dashboards and AI inference.
- Monitor and optimize cost and performance across workloads.
Integration & Enablement:
- Integrate BI tools (Power BI) with AI platforms.
- Support AI model deployment by providing curated datasets and features.
Collaboration:
- Work closely with other BI team members and business users to deliver quality solutions.
- Document processes and provide enablement for self-service analytics.
Requirements/Qualifications:
Experience:
- 4 to 7 years in data engineering or analytics with hands-on delivery of dimensional models and production-grade ETL/ELT pipelines.
- Strong experience enabling Copilot for Power BI and self-service BI by preparing and curating datasets optimized for natural-language queries, with consistent semantics and user-friendly metadata.
- Strong experience with Microsoft Fabric for modern data architecture, including Lakehouse, Dataflows Gen2, and OneLake integration.
Technical Skills:
- Strong SQL and dimensional modeling expertise optimized for BI & AI performance and usability.
- Hands-on with preparing model-ready datasets and feature tables for ML and GenAI.
- Experience with Delta/Parquet, lakehouse architecture, OneLake integration, feature stores, and vector databases.
- Experience with Power BI and Microsoft Fabric/Azure.
- Exposure to CI/CD for data pipelines and Git-based workflows.
- Understanding of CDC, incremental models, watermarking, and idempotency.
Soft Skills:
- Strong problem-solving and communication skills.
- Ability to collaborate across teams to achieve shared goals.
Posted in: Data Analytics & BI
Functional Area: Data Analysis / Business Analysis
Job Code: 1575549