Posted on: 22/04/2026
Description:
- Design and implement scalable data models using Data Vault 2.0 and Enterprise Data Modeling (EDM) principles.
- Work extensively with Snowflake, including data warehousing concepts, performance tuning, and optimization.
- Develop and maintain modular data pipelines using dbt for efficient data transformation.
- Build and manage robust ELT/ETL pipelines to support data integration and analytics use cases.
- Utilize orchestration tools like Apache Airflow to schedule, monitor, and manage workflows.
- Write optimized and complex SQL queries to handle large-scale data processing.
- Ensure data quality, consistency, and reliability across the data platform.
- Collaborate with cross-functional teams including data analysts, engineers, and business stakeholders.
- Troubleshoot data issues and continuously improve system performance and scalability.
Key Skills & Qualifications:
- Hands-on experience with Data Vault 2.0, EDM, and modern data warehousing practices.
- Strong expertise in Snowflake and performance optimization techniques.
- Practical experience with dbt for transformation and pipeline development.
- Solid understanding of ETL/ELT concepts and data engineering best practices.
- Experience with workflow orchestration tools (Airflow preferred).
- Advanced proficiency in SQL.
- Strong analytical, problem-solving, and debugging skills.
- Good communication skills with the ability to work effectively in a team environment.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1630415