Posted on: 21/04/2026
Description:
We are seeking a skilled Data Engineer with 6+ years of experience and strong expertise in Snowflake and dbt. The ideal candidate will have hands-on experience building scalable cloud data warehouse solutions, developing transformation frameworks, and optimizing data models for analytics and reporting. Experience with at least one ETL/ELT tool is expected; exposure to reporting tools is a plus.
This role requires a strong foundation in modern data engineering practices, SQL development, cloud-based data transformation, and close collaboration with business and technical stakeholders.
Key Responsibilities:
- Design, develop, and maintain scalable data transformation pipelines, with a primary focus on Snowflake and dbt.
- Build, enhance, and support dbt models, including staging, intermediate, and marts layers, following best practices for modularity, reusability, and maintainability (an illustrative staging-model sketch follows this list).
- Develop and optimize data solutions in Snowflake, including schema design, tables, views, SQL transformations, and performance tuning.
- Implement and manage ELT workflows using dbt alongside an ETL/ELT tool such as Matillion, Informatica, SSIS, Talend, Azure Data Factory (ADF), or similar.
- Work with structured and semi-structured data from multiple source systems and integrate it into Snowflake for downstream analytics and business consumption.
- Write efficient, high-quality SQL for data transformation, validation, reconciliation, and business logic implementation.
- Create and maintain dbt tests, documentation, and lineage-aware models to support data quality and governance standards (a singular-test sketch follows this list).
- Optimize Snowflake workloads for performance, scalability, and cost efficiency (a tuning sketch follows this list).
- Collaborate with business analysts, data architects, and stakeholders to understand requirements and translate them into robust data models and transformation logic.
- Support deployment, version control, and release processes for dbt and Snowflake-based solutions.
- Troubleshoot data pipeline failures, transformation issues, and performance bottlenecks in Snowflake and dbt environments.
- Participate in code reviews, technical discussions, and continuous improvement initiatives to enhance engineering standards and delivery quality.
- Ensure all data solutions are aligned with enterprise data governance, security, and compliance requirements.
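To illustrate the kind of dbt work described above, here is a minimal staging-model sketch. All names (the sales source, stg_orders, and its columns) are hypothetical; the pattern is the point: select from a declared source, rename and type the columns, and lift a field out of a semi-structured VARIANT payload using Snowflake path syntax.

```sql
-- models/staging/stg_orders.sql
-- Minimal staging-layer sketch; source and column names are hypothetical.

with source as (

    -- 'sales' / 'orders' would be declared in a dbt sources YAML file.
    select * from {{ source('sales', 'orders') }}

),

renamed as (

    select
        order_id,
        customer_id,
        order_ts::timestamp_ntz           as ordered_at,
        amount::number(12, 2)             as order_amount,
        -- Snowflake path syntax pulls a field out of a VARIANT column.
        payload:shipping.country::string  as shipping_country
    from source

)

select * from renamed
```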
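The data quality checks mentioned in the responsibilities can be expressed either as generic dbt tests in YAML or as singular tests written in plain SQL. A singular-test sketch, again with hypothetical names, fails the build if the query returns any rows:

```sql
-- tests/assert_order_amount_non_negative.sql
-- Singular dbt test sketch: any returned row counts as a failure.

select
    order_id,
    order_amount
from {{ ref('stg_orders') }}
where order_amount < 0
```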
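On the Snowflake tuning side, two common levers are clustering keys (to improve partition pruning on large tables) and warehouse auto-suspend settings (to control compute cost). A sketch with hypothetical object names:

```sql
-- Illustrative Snowflake tuning statements; object names are hypothetical.

-- Cluster a large fact table on common filter columns to improve pruning.
alter table analytics.marts.fct_orders
    cluster by (to_date(ordered_at), shipping_country);

-- Let an idle warehouse suspend quickly to reduce credit consumption.
alter warehouse transform_wh set
    auto_suspend = 60   -- seconds of inactivity before suspending
    auto_resume = true;
```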
Nice to Have:
- Experience with reporting and visualization tools such as Power BI, Tableau, Looker, or similar.
- Exposure to reporting data models, data marts, and analytics consumption layers.
- Experience with Python for automation or supplementary data processing.
- Familiarity with cloud platforms such as Azure, AWS, or GCP.
- Understanding of CI/CD practices for dbt and Snowflake deployments.
- Exposure to data quality frameworks, metadata management, and data governance practices.
- Experience working in Agile/Scrum delivery environments.
Required Qualifications:
- Degree in Computer Science, Information Technology, Engineering, or a related field.
- 6+ years of experience in data engineering, data warehousing, or data integration roles.
- Strong hands-on expertise in Snowflake for cloud data warehousing, SQL development, and performance optimization.
- Strong hands-on expertise in dbt, including model development, testing, documentation, and deployment.
- Strong proficiency in SQL for data transformation, data validation, and query optimization.
- Experience with at least one ETL/ELT tool such as Matillion, Informatica, SSIS, Talend, Azure Data Factory, or similar.
- Good understanding of data warehousing concepts, dimensional modeling, and ELT architecture.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1629925