Posted on: 04/08/2025
Role : Senior Data Engineer - Snowflake & SQL
Experience Required :
- Total relevant work experience : 6 - 9 years
- Total : 6+ years in Data Engineering, Data Warehousing, and SQL
- Snowflake : 3+ years hands-on experience
Primary Responsibilities :
- Design and implement scalable Data Warehouse solutions using Snowflake as the core platform.
- Understand the architectural differences between Snowflake and traditional data warehouse systems
- Develop and optimize complex SQL queries for analytical workloads, reporting, and data transformation.
- Build and maintain ELT/ETL pipelines using Snowflake-native features and external orchestration tools.
- Apply dimensional modeling techniques (Star/Snowflake schemas) to support BI and analytics use cases.
- Perform query profiling, performance tuning, and cost optimization in Snowflake.
- Implement and manage Snowflake objects including tables, views, materialized views, streams, tasks, UDFs, and stored procedures.
- Ensure data quality, lineage, and governance across the data lifecycle.
- Collaborate with cross-functional teams to understand data requirements and translate them into efficient data models and pipelines.
- Troubleshoot and resolve issues related to data ingestion, transformation, and query performance.
- Stay current with Snowflake features such as Time Travel, Zero Copy Cloning, Search Optimization, and Streams/Tasks.
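To give a flavor of the Snowflake features named above, here is a minimal sketch (table names such as `sales` and `sales_dev` are hypothetical, and retention windows depend on account settings):

```sql
-- Zero-Copy Cloning: create an instant copy of a table; the clone shares
-- micro-partitions with the source until either side changes data
CREATE TABLE sales_dev CLONE sales;

-- Time Travel: query the table as it existed one hour ago
SELECT COUNT(*) FROM sales AT (OFFSET => -60 * 60);

-- Restore a dropped table, provided it is still within the retention window
UNDROP TABLE sales;
```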
Technical Skills :
Primary Skills (Must-Have) :
- Advanced SQL : Joins, window functions, CTEs, recursive queries, analytical functions
- Snowflake SQL & Architecture : Virtual warehouses, micro-partitions, clustering, caching, scaling, max concurrency
- Data Modeling : Star/Snowflake schemas, normalization/denormalization strategies
- Performance Tuning : Query optimization, warehouse sizing, result caching
- ETL/ELT Development : Snowflake-native features (stages, COPY INTO) and external orchestration tools (ADF, Airflow, etc.)
- Stored Procedures & Functions : SQL- and JavaScript-based scripting in Snowflake, including exception handling
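As an example of the advanced SQL expected here, a short sketch combining a CTE with a window function (the `orders` table and its columns are hypothetical):

```sql
-- Rank each customer's orders by recency, then keep only the latest one
WITH ranked AS (
    SELECT
        customer_id,
        order_id,
        order_date,
        ROW_NUMBER() OVER (
            PARTITION BY customer_id
            ORDER BY order_date DESC
        ) AS rn
    FROM orders
)
SELECT customer_id, order_id, order_date
FROM ranked
WHERE rn = 1;
```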
Secondary Skills (Nice to Have) :
- Azure Data Factory (ADF) : Pipelines, triggers, ADLS Gen2, IR, Linked Services
- Python or PySpark : For data transformation and automation
- Snowpipe & Streams/Tasks : For real-time and CDC-based ingestion
- Data Security : Row-level security, masking policies, encryption hierarchy
- DevOps Tools : Git, CI/CD, Terraform (basic understanding)
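The Streams/Tasks ingestion pattern listed above can be sketched as follows (a minimal example; `orders`, `orders_history`, and the warehouse name are hypothetical, and the target table is assumed to match the stream's column layout):

```sql
-- Capture row-level changes (CDC) on the source table
CREATE OR REPLACE STREAM orders_stream ON TABLE orders;

-- A task that periodically drains the stream into a history table,
-- running only when the stream actually has pending changes
CREATE OR REPLACE TASK merge_orders_task
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  INSERT INTO orders_history
  SELECT *, CURRENT_TIMESTAMP() AS loaded_at FROM orders_stream;

-- Tasks are created suspended; resume to start the schedule
ALTER TASK merge_orders_task RESUME;
```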
Soft Skills & Methodologies :
- Strong analytical and problem-solving skills
- Excellent communication and documentation abilities
- Experience working in Agile/Scrum environments
- Ability to lead and mentor junior team members
- Comfortable working in fast-paced, multi-project environments
Education & Certifications :
- BTech or BE (any specialization)
- SnowPro Core Certification is a plus
Posted in : Data Engineering
Functional Area : Data Engineering
Job Code : 1524513