Posted on: 30/03/2026
Description:
Role & Responsibilities:
Data Platform Engineering:
- Design and build scalable data pipelines and services on AWS.
- Develop and maintain Python-based data processing frameworks.
- Build ingestion pipelines that integrate multiple data sources, including APIs and customer data feeds.
- Develop and maintain transformation workflows using dbt.
Data Modelling and Architecture:
- Design and implement relational data models supporting analytics and operational workloads.
- Build and maintain performant schemas within AWS RDS MySQL.
- Contribute to the design of scalable, performant data models and transformation layers.
Data Processing and Transformation:
- Implement scalable transformation pipelines including data standardisation and deduplication.
Performance and Reliability:
- Optimise SQL queries and ETL workloads for performance.
- Implement data validation and quality assurance processes.
- Build observable pipelines with logging and monitoring.
- Ensure pipelines are resilient, maintainable, and scalable.
Preferred candidate profile:
Core Technical Skills:
- Strong Python development experience in data engineering environments.
- Extensive experience with AWS RDS MySQL.
- Experience using dbt for data transformation and modelling.
- Deep understanding of relational database design.
Data Engineering:
- Experience designing and building large-scale ETL/ELT pipelines.
- Experience working with high-volume datasets.
- Strong understanding of scalable data pipeline architecture.
AWS Experience:
- Hands-on experience with AWS services including RDS MySQL, S3, DMS, Lambda, and CloudWatch.
Engineering Practices:
- Experience working with Git-based development workflows.
- Ability to design maintainable and scalable engineering solutions.
Desirable:
- Experience supporting or enabling machine learning workflows.
- Experience building data deduplication or identity resolution systems.
- Experience working in multi-tenant data platforms.
Behaviours and Attributes:
- Strong systems-thinking mindset.
- Structured and reliable approach to managing data workflows.
- Proactive and curious about improving data processes through automation.
- Communicates clearly with both technical and non-technical audiences.
- Takes ownership of data quality and operational outcomes.
- Collaborative and solutions-oriented.
- Committed to continuous improvement and process scalability.
Posted by: Recruiter
Last Active: N/A (job posted through a third-party tool)
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1624730