hirist

Senior Data Engineer - Snowflake/Data Build Tool

Yo Hr Consultancy
Hyderabad
6 - 12 Years

Posted on: 19/12/2025

Job Description

Title : Senior Data Engineer - Snowflake/dbt

Experience : 6 to 12 years

Location : Hyderabad

Work Mode : Five days a week from the office (no hybrid/remote options available)

We are looking for an experienced, results-driven Data Engineer to join our growing Data Engineering team. The ideal candidate will be proficient in building scalable, high-performance data transformation pipelines using Snowflake and dbt, and able to work effectively in a consulting setup. In this role, you will be instrumental in ingesting, transforming, and delivering high-quality data to enable data-driven decision-making across the client's organization.

Key Responsibilities :

- Design and build robust ELT pipelines using dbt on Snowflake, including ingestion from relational databases, APIs, cloud storage, and flat files.

- Reverse-engineer and optimize SAP Data Services (SAP DS) jobs to support scalable migration to cloud-based data platforms.

- Implement layered data architectures (e.g., staging, intermediate, mart layers) to enable reliable and reusable data assets.

- Enhance dbt/Snowflake workflows through performance optimization techniques such as clustering, partitioning, query profiling, and efficient SQL design.

- Use orchestration tools like Airflow, dbt Cloud, and Control-M to schedule, monitor, and manage data workflows.

- Apply modular SQL practices, testing, documentation, and Git-based CI/CD workflows for version-controlled, maintainable code.

- Collaborate with data analysts, scientists, and architects to gather requirements, document solutions, and deliver validated datasets.

- Contribute to internal knowledge sharing through reusable dbt components and participate in Agile ceremonies to support consulting delivery.

Required Qualifications - Data Engineering Skills :

- 3-5 years of experience in data engineering, with hands-on experience in Snowflake and basic to intermediate proficiency in dbt.

- Capable of building and maintaining ELT pipelines using dbt and Snowflake with guidance on architecture and best practices.

- Understanding of ELT principles and foundational knowledge of data modeling techniques (preferably Kimball/Dimensional).

- Intermediate experience with SAP Data Services (SAP DS), including extracting, transforming, and integrating data from legacy systems.

- Proficient in SQL for data transformation and basic performance tuning in Snowflake (e.g., clustering, partitioning, materializations).

- Familiar with workflow orchestration tools like dbt Cloud, Airflow, or Control-M.

- Experience using Git for version control and exposure to CI/CD workflows in team environments.

- Exposure to cloud storage solutions such as Azure Data Lake, AWS S3, or GCS for ingestion and external staging in Snowflake.

- Working knowledge of Python for basic automation and data manipulation tasks.

- Understanding of Snowflake's role-based access control (RBAC), data security features, and general data privacy practices like GDPR.

Data Quality & Documentation :

- Familiar with dbt testing and documentation practices (e.g., dbt tests, dbt docs).

- Awareness of standard data validation and monitoring techniques for reliable pipeline development.

Soft Skills & Collaboration :

- Strong problem-solving skills and ability to debug SQL and transformation logic effectively.

- Able to document work clearly and communicate technical solutions to a cross-functional team.

- Experience working in Agile settings, participating in sprints, and handling shifting priorities.

- Comfortable collaborating with analysts, data scientists, and architects across onshore/offshore teams.

- High attention to detail, proactive attitude, and adaptability in dynamic project environments.

Nice to Have :

- Experience working in client-facing or consulting roles.

- Exposure to AI/ML data pipelines or tools like feature stores and MLflow.

- Familiarity with enterprise-grade data quality tools.

Education :


- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.

- Certifications such as Snowflake SnowPro or dbt Certified Developer are a plus.

Required Qualification :

- Bachelor of Engineering - Bachelor of Technology (B.E./B.Tech.)



Posted by

HR

HR Manager at Yo Hr Consultancy

Last Active: 20 Dec 2025

Job Views: 84
Applications: 15
Recruiter Actions: 10

Functional Area

Data Engineering

Job Code

1592980