Posted on: 29/01/2026
Description:
- 8 to 15 years of Data Engineering experience, including at least 4 years of hands-on work in Snowflake.
- Proven success designing and deploying large-scale data pipelines and dimensional models using modern cloud data platforms.
- Deep understanding of Kimball and Data Vault methodologies, and how they fit within Lakehouse and event-driven architectures.
- Advanced proficiency with SQL and Python for pipeline development, data validation, and automation.
- Experience leading data engineering projects through the full SDLC, from requirements through deployment and monitoring.
- Expertise in integrating batch and streaming data sources, including Kafka, APIs, and message queues.
- Experience with orchestration and workflow management tools such as Airflow, Matillion, or dbt.
- Cloud experience with Azure and/or AWS, including cost optimization and data security best practices.
- Strong communication and leadership skills to guide cross-functional teams and present technical concepts clearly to non-technical stakeholders.
- Value-add: familiarity with vector and graph databases, Apache NiFi, and modern data sharing or clean-room technologies.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1607486