Posted on: 29/10/2025
Description:
Job Title: Snowflake + DBT Developer
Location: Hyderabad
Experience: 4+ years
Job Type: Full-Time
Department: Enterprise Data and Analytics
About the Role:
We are looking for an experienced Snowflake + DBT Developer to join our Technology Team.
This role demands a deep understanding of data pipeline migration, modeling, and orchestration using modern data stack technologies like Snowflake, DBT, Databricks, and Airflow.
You will play a crucial role in migrating existing pipelines from Databricks to Snowflake, building scalable and performant workflows, and ensuring seamless data integration and delivery across environments.
This is a highly collaborative role requiring strong communication skills and hands-on experience with CI/CD, orchestration, monitoring, and modern data engineering best practices.
Key Responsibilities:
Migration & Architecture:
- Analyze and understand existing data pipelines built using Databricks (PySpark/Spark).
- Re-architect and rebuild pipelines for Snowflake using SQL-based transformations and DBT models (a minimal sketch of such a rewrite follows this list).
- Ensure high performance, scalability, and data consistency during and post-migration.
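For illustration only, here is a minimal sketch of the kind of rewrite this involves, assuming a hypothetical PySpark aggregation job; all table names and credentials are placeholders, and the Snowflake side uses the snowflake-connector-python package:

# Hypothetical Databricks step (PySpark):
#   daily = (spark.table("raw.orders")
#            .where(col("status") == "COMPLETE")
#            .groupBy("order_date")
#            .agg(sum("amount").alias("revenue")))
#   daily.write.mode("overwrite").saveAsTable("marts.daily_revenue")
#
# Equivalent set-based rewrite pushed down into Snowflake:
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder credentials
    user="etl_user",
    password="...",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
)
conn.cursor().execute("""
    CREATE OR REPLACE TABLE marts.daily_revenue AS
    SELECT order_date, SUM(amount) AS revenue
    FROM raw.orders
    WHERE status = 'COMPLETE'
    GROUP BY order_date
""")
conn.close()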
Snowflake Development:
- Develop using Snowflake-native features: Snowpipe, Streams/Tasks, Dynamic Tables, Materialized Views, Secured Views, RBAC, Data Masking, Row-Level Security, etc. (two of these are sketched after this list).
- Implement performance tuning, query optimization, and cost-effective warehouse management.
- Leverage metadata layers, query history, and account usage for monitoring and auditing.
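By way of illustration, a minimal sketch of two of these features, a Dynamic Table plus an ACCOUNT_USAGE cost query, issued through snowflake-connector-python (all object names and credentials are placeholders):

import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="etl_user", password="...")
cur = conn.cursor()

# Dynamic Table: Snowflake keeps the result refreshed within the target lag.
cur.execute("""
    CREATE OR REPLACE DYNAMIC TABLE analytics.marts.order_totals
    TARGET_LAG = '15 minutes'
    WAREHOUSE = TRANSFORM_WH
    AS
    SELECT customer_id, SUM(amount) AS total_amount
    FROM analytics.raw.orders
    GROUP BY customer_id
""")

# Account usage: surface the slowest queries of the past week for cost tuning.
cur.execute("""
    SELECT query_id, total_elapsed_time, warehouse_name
    FROM snowflake.account_usage.query_history
    WHERE start_time > DATEADD('day', -7, CURRENT_TIMESTAMP())
    ORDER BY total_elapsed_time DESC
    LIMIT 20
""")
print(cur.fetchall())
conn.close()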
DBT (Data Build Tool):
- Build and manage modular DBT models, macros, and materializations (a minimal model sketch follows this list).
- Maintain version-controlled environments and promote changes across Dev → Test → Prod using CI/CD.
- Ensure documentation, testing, and validation within DBT framework.
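dbt models are usually SQL, but dbt also supports Python models on Snowflake via Snowpark; as a minimal sketch with hypothetical model names:

# models/marts/customer_totals.py -- a minimal dbt Python model (names are hypothetical).
# dbt calls model() with a dbt context object and a Snowpark session.
import snowflake.snowpark.functions as F

def model(dbt, session):
    dbt.config(materialized="table")   # same materialization choices as SQL models
    orders = dbt.ref("stg_orders")     # declares a dependency on an upstream staging model
    return (
        orders.group_by("customer_id")
              .agg(F.sum("amount").alias("total_amount"))
    )

Promotion across environments then reduces to running dbt build with per-environment targets, with dbt test and dbt docs generate wired into CI.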
Workflow Orchestration & Automation:
- Design, build, and manage Airflow DAGs for end-to-end orchestration (sketched below).
- Integrate orchestration with tools such as Cosmos, Astronomer, or Docker.
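A minimal Airflow 2.x sketch of such a DAG follows; the DAG id, schedule, and dbt commands are assumptions, not a prescribed setup. In practice, tools like Cosmos can expand a dbt project into individual Airflow tasks instead of a single shell step.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical daily run: build the dbt project, then test it.
with DAG(
    dag_id="snowflake_dbt_daily",
    start_date=datetime(2025, 1, 1),
    schedule="0 2 * * *",
    catchup=False,
) as dag:
    dbt_run = BashOperator(task_id="dbt_run", bash_command="dbt run --target prod")
    dbt_test = BashOperator(task_id="dbt_test", bash_command="dbt test --target prod")
    dbt_run >> dbt_test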
DevOps & CI/CD:
- Create and manage automated deployment pipelines using GitHub Actions, Azure DevOps, or similar (a sketch of such a step follows this list).
- Enforce coding standards, peer reviews, and Git-based version control.
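GitHub Actions itself is configured in YAML; staying with Python here, a hedged sketch of the kind of gate script such a pipeline might invoke (the "ci" target is an assumption):

import subprocess
import sys

# Hypothetical CI step: build and test the dbt project against an isolated
# CI target; a non-zero exit fails the pipeline and blocks promotion.
def deploy(target: str) -> None:
    for cmd in (
        ["dbt", "deps"],                       # install package dependencies
        ["dbt", "build", "--target", target],  # run + test models in DAG order
    ):
        result = subprocess.run(cmd)
        if result.returncode != 0:
            sys.exit(result.returncode)

if __name__ == "__main__":
    deploy(sys.argv[1] if len(sys.argv) > 1 else "ci")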
Monitoring & Alerting:
- Implement proactive monitoring, logging, and alerting mechanisms for data pipelines.
- Develop self-healing or alert-driven systems to address critical failures or anomalies (see the sketch after this list).
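A hedged sketch of alert-driven monitoring against Snowflake's ACCOUNT_USAGE.TASK_HISTORY view; the credentials and webhook URL are placeholders:

import json
import urllib.request

import snowflake.connector

# Poll account usage for tasks that failed in the last hour.
conn = snowflake.connector.connect(account="my_account", user="monitor_user", password="...")
cur = conn.cursor()
cur.execute("""
    SELECT name, state, error_message, completed_time
    FROM snowflake.account_usage.task_history
    WHERE state = 'FAILED'
      AND completed_time > DATEADD('hour', -1, CURRENT_TIMESTAMP())
""")
failures = cur.fetchall()
conn.close()

if failures:
    # Hypothetical webhook; in practice this might be Slack, PagerDuty, etc.
    body = json.dumps({"text": f"{len(failures)} Snowflake task failure(s): {failures}"})
    req = urllib.request.Request(
        "https://hooks.example.com/alerts",
        data=body.encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)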
Technical Requirements:
- 4+ years of experience in data engineering with strong hands-on experience in Snowflake and DBT.
- Proven experience in Databricks (PySpark/Spark) and migrating workloads to Snowflake.
Strong Understanding Of:
- RBAC, Data Sharing, Database Replication, Stored Procedures, Snowpipe vs Streams, Authentication (MFA).
- Dynamic Tables vs Materialized Views, Secured Views, Query Optimization, Pruning, Metadata, and Query History.
- Proficiency in SQL development, data modeling, and performance tuning in Snowflake.
- Experience with orchestration tools like Airflow and containerized deployments using Docker.
- Familiarity with CI/CD pipelines, code versioning, and deployment automation.
- Knowledge of data governance, security controls, and masking techniques.
Nice-to-Have:
- Experience with headless BI tools or data visualization platforms.
- Familiarity with cloud services (AWS/Azure/GCP) and Snowflake deployments on AWS or Azure.
- Exposure to i18n/localization in data products or multi-tenant data platforms.
Soft Skills:
- Strong analytical thinking and attention to detail.
- Excellent verbal and written communication skills.
- Ability to work independently as well as collaboratively across cross-functional teams.
- Self-driven, proactive, and capable of managing multiple tasks simultaneously.
- Willingness to mentor peers and contribute to process improvement and documentation.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1566263