Posted on: 11/12/2025
Snowflake Data Engineer
Location: Gurgaon (5 days a week in office)
Experience: 1-2 years
Responsibilities:
- Build ETL/ELT pipelines using Snowflake Tasks, Streams, and Stored Procedures (a sketch follows this list)
- Design data models with clustering keys and optimize micro-partitioning
- Implement Snowpipe for real-time ingestion and external tables for data lake integration (see the Snowpipe sketch after this list)
- Develop JavaScript/SQL stored procedures and UDFs for complex transformations
- Write complex SQL queries with window functions, CTEs, and pivots
- Manage multi-cluster warehouses and implement resource monitors for cost control (good to have)
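
A minimal sketch of the Stream + Task pipeline pattern from the first bullet. All object names here (raw_orders, orders_stream, orders_curated, load_orders_task, etl_wh) are hypothetical and used only for illustration:

    -- Capture row-level changes on the hypothetical raw_orders table
    CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders;

    -- Scheduled task that merges captured changes into a curated table
    CREATE OR REPLACE TASK load_orders_task
      WAREHOUSE = etl_wh
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      MERGE INTO orders_curated t
      USING orders_stream s
        ON t.order_id = s.order_id
      WHEN MATCHED THEN UPDATE SET t.amount = s.amount
      WHEN NOT MATCHED THEN INSERT (order_id, amount) VALUES (s.order_id, s.amount);

    -- Tasks are created suspended; resume to start the schedule
    ALTER TASK load_orders_task RESUME;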
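
A similarly hedged sketch of Snowpipe ingestion and an external table, assuming an external stage named @orders_stage that holds JSON files and is wired to cloud event notifications:

    -- Continuously load JSON files that land in the stage into raw_orders
    CREATE OR REPLACE PIPE orders_pipe
      AUTO_INGEST = TRUE
    AS
      COPY INTO raw_orders
      FROM @orders_stage
      FILE_FORMAT = (TYPE = 'JSON');

    -- External table over the same stage for data-lake style queries
    -- (rows are exposed through the VARIANT column VALUE)
    CREATE OR REPLACE EXTERNAL TABLE orders_ext
      LOCATION = @orders_stage
      FILE_FORMAT = (TYPE = 'JSON');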
Requirements:
- Hands-on Snowflake development experience
- Strong Snowflake SQL skills and basic JavaScript stored procedure development
- Working knowledge of Tasks, Streams, Stages, File Formats, Time Travel
- Experience with dbt core concepts (models, sources, tests, snapshots)
- Understanding of warehouse sizing, auto-suspend/resume, and query optimization
- Familiarity with RBAC, data types, and semi-structured data (JSON/XML); a small query example follows this list
- Basic knowledge of SnowSQL CLI and Snowflake web interface
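
As noted in the list above, a small illustrative query combining a CTE, a window function, and semi-structured JSON access. The events table, its VARIANT column payload, and the field names are assumptions made up for this example:

    -- Pull typed fields out of the VARIANT payload, then compute a per-customer running total
    WITH daily AS (
        SELECT
            payload:customer_id::STRING  AS customer_id,
            payload:amount::NUMBER(10,2) AS amount,
            DATE_TRUNC('day', event_ts)  AS event_day
        FROM events
    )
    SELECT
        customer_id,
        event_day,
        SUM(amount) OVER (PARTITION BY customer_id ORDER BY event_day) AS running_total
    FROM daily;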
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1588461