
Data Engineer - Snowflake DB/Matillion

Jhavion Consultancy
Hyderabad
3 - 5 Years

Posted on: 03/12/2025

Job Description

Description :


We are looking for an experienced and results-driven Data Engineer to join our growing Data Engineering team. The ideal candidate will be proficient in building scalable, high-performance data transformation pipelines using Snowflake with dbt or Matillion, and comfortable working effectively in a consulting setup. In this role, you will be instrumental in ingesting, transforming, and delivering high-quality data to enable data-driven decision-making across the client's organization.

Key Responsibilities :


- Design and implement scalable ELT pipelines using dbt on Snowflake, following industry-accepted best practices.

- Build ingestion pipelines from various sources, including relational databases, APIs, cloud storage, and flat files, into Snowflake.

- Implement data modelling and transformation logic to support a layered architecture (e.g., staging, intermediate, and mart layers, or a medallion architecture) to enable reliable and reusable data assets (a dbt sketch follows this list).

- Leverage orchestration tools (e.g., Airflow, dbt Cloud, or Azure Data Factory) to schedule and monitor data workflows.

- Apply dbt best practices : modular SQL development, testing, documentation, and version control.

- Perform performance optimizations in dbt/Snowflake through clustering, query profiling, materialization, partitioning, and efficient SQL design.

- Apply CI/CD and Git-based workflows for version-controlled deployments.

- Contribute to a growing internal knowledge base of dbt macros, conventions, and testing frameworks.

- Collaborate with multiple stakeholders such as data analysts, data scientists, and data architects to understand requirements and deliver clean, validated datasets.

- Write well-documented, maintainable code using Git for version control and CI/CD processes.

- Participate in Agile ceremonies including sprint planning, stand-ups, and retrospectives.

- Support consulting engagements through clear documentation, demos, and delivery of client-ready solutions.
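
The layered dbt approach above can be illustrated with a minimal sketch. All names here (the raw_crm source, stg_customers, dim_customers) are hypothetical placeholders, not an actual client schema:

    -- models/staging/stg_customers.sql
    -- Staging layer: light renaming/cleanup of one raw source table.
    with source as (
        select * from {{ source('raw_crm', 'customers') }}
    ),
    renamed as (
        select
            id           as customer_id,
            trim(name)   as customer_name,
            lower(email) as email,
            created_at
        from source
    )
    select * from renamed

    -- models/marts/dim_customers.sql
    -- Mart layer: business-ready dimension built only from staging models.
    {{ config(materialized='table') }}
    select
        customer_id,
        customer_name,
        email,
        created_at as first_seen_at
    from {{ ref('stg_customers') }}

Because the mart uses ref() rather than a hard-coded table name, dbt infers the dependency graph and builds the layers in the correct order.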

Required Qualifications :


- 3 to 5 years of experience in data engineering roles, with 2+ years of hands-on experience in Snowflake and dbt or Matillion (Matillion DPC is highly preferred, but not mandatory).

- Experience building and deploying dbt models in a production environment.

- Expert-level SQL and a strong understanding of ELT patterns and data modelling (Kimball/dimensional modelling preferred).

- Familiarity with data quality and validation techniques : dbt tests, dbt docs, etc. (a test sketch follows this list).

- Experience with Git, CI/CD, and deployment workflows in a team setting.

- Familiarity with orchestrating workflows using tools like dbt Cloud, Airflow, or Azure Data Factory.
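
One of the validation techniques listed above can be sketched as a singular dbt test: a SQL file under tests/ that passes when it returns zero rows. The model and column names below are hypothetical:

    -- tests/assert_no_nonpositive_order_amounts.sql
    -- Singular dbt test: any returned row counts as a failure.
    select
        order_id,
        amount
    from {{ ref('fct_orders') }}
    where amount <= 0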

Core Competencies :


Data Engineering and ELT Development :

- Building robust and modular data pipelines using dbt.

- Writing efficient SQL for data transformation and performance tuning in Snowflake (a tuning sketch follows this list).

- Managing environments, sources, and deployment pipelines in dbt.
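
A rough sketch of the Snowflake-side tuning this implies; the table, key, and warehouse names are hypothetical, and whether a clustering key pays off depends on table size and query patterns:

    -- Define a clustering key on a large fact table so partition
    -- pruning can skip micro-partitions.
    alter table analytics.fct_orders cluster by (order_date, customer_id);

    -- Check how well the table is actually clustered on those columns.
    select system$clustering_information('analytics.fct_orders', '(order_date, customer_id)');

    -- Profile the most expensive recent queries from the account usage views.
    select query_id, total_elapsed_time, bytes_scanned
    from snowflake.account_usage.query_history
    where warehouse_name = 'TRANSFORM_WH'
    order by total_elapsed_time desc
    limit 20;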

Cloud Data Platform Expertise :


- Strong proficiency with Snowflake : warehouse sizing, query profiling, data loading, and performance optimization.

- Experience working with cloud storage (Azure Data Lake, AWS S3, or GCS) for ingestion and external stages.
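
A minimal ingestion sketch using an external stage, assuming an S3 bucket and a pre-existing storage integration; all object names are hypothetical:

    -- Hypothetical external stage over cloud storage.
    create stage raw.public.crm_stage
      url = 's3://example-bucket/crm/'
      storage_integration = s3_int
      file_format = (type = 'csv' skip_header = 1);

    -- Bulk-load matching files from the stage into a raw table.
    copy into raw.public.customers
    from @raw.public.crm_stage
    pattern = '.*customers.*[.]csv';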

Technical Toolset :


Languages & Frameworks :


- Python : For data transformation, notebook development, and automation.

- SQL : Strong grasp of SQL for querying and performance tuning.

Best Practices and Standards :

- Knowledge of modern data architecture concepts, including layered architecture (e.g., staging, intermediate, and mart layers, or a medallion architecture).

- Familiarity with data quality, unit testing (dbt tests), and documentation (dbt docs).

