Posted on: 23/09/2025
Role Overview:
The ideal candidate will have a strong command of SQL, a sound understanding of Data Vault 2.0 (DV2.0) methodology, and familiarity with Enterprise Data Management (EDM) principles.
This role requires designing scalable workflows and implementing CI/CD pipelines to ensure continuous delivery of high-quality data solutions.
Key Responsibilities:
- Develop and maintain data transformations and modeling workflows using DBT.
- Utilize Snowflake as the data platform, optimizing queries and storage for performance and cost efficiency.
- Apply Data Vault 2.0 methodology to build maintainable, auditable, and scalable data models.
- Implement Enterprise Data Management (EDM) best practices within data engineering workflows.
- Design, build, and manage end-to-end scalable data workflows, integrating CI/CD principles and tools for automation and testing.
- Collaborate closely with data architects, analysts, and business stakeholders to ensure pipelines meet data quality and governance standards.
- Monitor pipeline performance, troubleshoot issues, and optimize for reliability and scalability.
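The Data Vault 2.0 modeling referenced above centers on deterministic hash keys for hubs, links, and satellites, so that the same business key always resolves to the same record. A minimal illustrative sketch (the function name and normalization rules are assumptions for illustration, not part of this posting):

```python
import hashlib

def hub_hash_key(business_key: str) -> str:
    """Derive a deterministic hash key from a business key,
    in the style of a Data Vault 2.0 hub table."""
    # Normalize before hashing (trim whitespace, upcase) so
    # differently formatted inputs yield the same hash key.
    normalized = business_key.strip().upper()
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# Two differently formatted inputs map to one hub record.
print(hub_hash_key("cust-001") == hub_hash_key("  CUST-001 "))  # True
```

In practice the same normalization-then-hash logic would live in the warehouse layer (for example as a DBT macro) rather than in application code.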
Required Skills and Qualifications:
- Strong proficiency in DBT for building and managing data transformations and models.
- Hands-on experience with the Snowflake cloud data warehouse.
- Familiarity with the Data Vault 2.0 modeling framework and its practical application in enterprise-scale data solutions.
- Understanding of Enterprise Data Management (EDM) concepts related to metadata management, data quality, and lineage.
- Experience in designing scalable workflows and implementing CI/CD pipelines for data engineering projects.
- Strong collaboration and communication skills to work effectively across technical and business teams.
Recommended Experience:
- 3 to 6 years of experience in data engineering, BI, or analytics roles, specializing in modern cloud data platforms and architecture methodologies.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1551150