Posted on: 18/08/2025
Experience : 7+ years
Location : Bangalore
Job Type : 6-month contract with possible extension
Overview : Who Is a Snowflake Developer?
A Snowflake Developer specializes in designing, implementing, and managing data storage and processing solutions on the Snowflake data platform. They leverage Snowflake's cloud-native architecture, working across SQL, ETL pipelines, performance tuning, and security, and collaborating with cross-functional teams.
Roles & Responsibilities :
Core Responsibilities :
- Design, develop, test, deploy, and maintain enterprise-level applications on Snowflake.
- Build, monitor, and optimize ETL/ELT processes using Snowflake and integration tools.
- Write, optimize, and tune complex SQL queries for efficient reporting and data retrieval.
- Design Snowflake architectures, including roles, schemas, and data models.
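To ground the ETL/ELT points above, here is a minimal sketch of an ELT step: raw rows are landed first, then transformed with SQL inside the warehouse. SQLite stands in for Snowflake so the sketch runs anywhere; the table and column names are illustrative, and Snowflake's SQL for this pattern is essentially the same.

```python
import sqlite3

# In-memory SQLite stands in for a Snowflake warehouse in this sketch.
conn = sqlite3.connect(":memory:")

# Load: land raw data as-is (the "EL" of ELT); amounts arrive as text.
conn.execute("CREATE TABLE raw_orders (order_id INTEGER, amount TEXT, country TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "20.00", "IN"), (2, "5.00", "IN"), (3, "42.50", "US")],
)

# Transform: clean and aggregate with SQL inside the warehouse (the "T").
conn.execute("""
    CREATE TABLE orders_by_country AS
    SELECT country, SUM(CAST(amount AS REAL)) AS total_amount
    FROM raw_orders
    GROUP BY country
""")

rows = conn.execute(
    "SELECT country, total_amount FROM orders_by_country ORDER BY country"
).fetchall()
print(rows)  # [('IN', 25.0), ('US', 42.5)]
```

In a real Snowflake pipeline the load step would typically be Snowpipe or a `COPY INTO` from a stage, but the load-then-transform-in-SQL shape is the same.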
Performance & Scalability :
- Perform performance tuning, query optimization, resource monitoring, and scalability solutions.
Security & Access Control :
- Implement identity and access management, encryption, role-based controls, row-level security, and compliance measures.
Migration & Integration :
- Migrate data from on-premises or other systems to Snowflake and integrate Snowflake with BI tools (e.g., Tableau, Looker) and ETL frameworks.
Documentation & Collaboration :
- Document technical specs, data models, pipelines, and architecture; coordinate with analysts, engineers, and stakeholders.
- Review and audit data models and pipelines for improvements; provide application support (UAT, bug fixing).
Additional Strategic & Soft Responsibilities :
- Own projects end-to-end, from requirements gathering to deployment, troubleshooting, and training.
- Conduct risk assessments, manage mitigation plans, and ensure data confidentiality.
- Engage in status reporting and routine team interactions.
Skills & Qualifications :
Technical Skills :
- SQL Proficiency, including complex query writing and optimization.
- Snowflake-specific capabilities : understanding of the architecture and of features such as Snowpipe, Time Travel, Zero-Copy Cloning, and virtual warehouses.
- ETL/ELT knowledge and experience with tools such as Informatica, Fivetran, dbt, and Talend.
- Data Warehousing & Modeling including star/snowflake schemas, normalization, and metadata management.
- Cloud Platform Familiarity (AWS, Azure, GCP; including services like S3, Lambda, EC2).
- Performance Tuning techniques, resource monitoring, cost optimization (auto-suspend/resume, clustering).
- Security & Compliance, including encryption, access controls, data masking.
- Scripting and automation using Python, JavaScript, PySpark, Snowflake Tasks, and stored procedures.
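As a concrete illustration of the star-schema modeling mentioned above, the sketch below builds a one-fact, one-dimension star and runs a typical reporting join. SQLite is used so it runs without a Snowflake account; all table and column names are illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Dimension table: descriptive attributes keyed by a surrogate key.
conn.execute("""
    CREATE TABLE dim_product (
        product_key INTEGER PRIMARY KEY,
        product_name TEXT,
        category TEXT
    )
""")

# Fact table: numeric measures plus foreign keys into the dimensions.
conn.execute("""
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity INTEGER,
        revenue REAL
    )
""")

conn.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                 [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware"), (3, "Manual", "Books")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                 [(10, 1, 2, 50.0), (11, 2, 1, 30.0), (12, 3, 5, 25.0)])

# Typical star-schema reporting query: join fact to dimension, aggregate by attribute.
report = conn.execute("""
    SELECT d.category, SUM(f.revenue) AS total_revenue
    FROM fact_sales f
    JOIN dim_product d ON d.product_key = f.product_key
    GROUP BY d.category
    ORDER BY d.category
""").fetchall()
print(report)  # [('Books', 25.0), ('Hardware', 80.0)]
```

A snowflake schema would further normalize `dim_product` (e.g., split `category` into its own table); the fact-to-dimension join pattern stays the same.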
Soft Skills & Domain Knowledge :
- Strong analytical thinking, problem-solving, and attention to detail.
- Excellent communication, collaboration, project management, and ownership capabilities.
- Industry/domain experience (e.g., finance, telecom) can be beneficial.
Optional but beneficial :
- A formal degree in Computer Science, Data Engineering, or a related field.
- Certifications (SnowPro, AWS/Azure/GCP, SQL, Data Warehousing).
Job Summary :
We are looking for a skilled Snowflake Developer with hands-on experience in Python, SQL, and Snowpark to join our data engineering team. You will be responsible for designing and building scalable data pipelines, developing Snowpark-based data applications, and enabling advanced analytics solutions on the Snowflake Data Cloud platform.
Key Responsibilities :
- Develop and maintain robust, scalable, and high-performance data pipelines using Snowflake SQL, Python, and Snowpark.
- Use Snowpark (Python API) to build data engineering and data science workflows within the Snowflake environment.
- Perform advanced data transformation, modeling, and optimization to support business reporting and analytics.
- Tune queries and warehouse usage for cost and performance optimization.
- Leverage Azure data services for data ingestion, orchestration, and observability.
- Implement best practices for data governance, security, and data quality within Snowflake.
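A data-quality gate like the last bullet describes can be as simple as SQL assertions run after each load. A runnable sketch (SQLite again stands in for Snowflake; the table and rule names are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (customer_id INTEGER, email TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "a@example.com"), (2, None), (1, "b@example.com")])

# Each rule is a SQL query counting violating rows; 0 means the rule passes.
quality_rules = {
    "no_null_emails": "SELECT COUNT(*) FROM customers WHERE email IS NULL",
    "unique_customer_ids": """
        SELECT COUNT(*) FROM (
            SELECT customer_id FROM customers
            GROUP BY customer_id HAVING COUNT(*) > 1
        )
    """,
}

violations = {name: conn.execute(sql).fetchone()[0] for name, sql in quality_rules.items()}
failed = {name: n for name, n in violations.items() if n > 0}
print(failed)  # {'no_null_emails': 1, 'unique_customer_ids': 1}
```

In practice such checks would run as a scheduled Snowflake Task or a pipeline step that fails the run when `failed` is non-empty.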
Required Skills :
- 7+ years of hands-on experience with Snowflake development and administration.
- Strong command of SQL for complex queries, data modeling, and transformations.
- Proficient in Python, especially in the context of data engineering and Snowpark usage.
- Working experience with Snowpark for building data pipelines or analytics applications.
- Understanding of data warehouse architecture, ELT/ETL processes, and cloud data platforms.
- Familiarity with version control systems (e.g., Git) and CI/CD pipelines.
Preferred Qualifications :
- Experience with cloud platforms like AWS, Azure, or GCP.
- Knowledge of orchestration tools such as Airflow or dbt.
- Familiarity with data security and role-based access control (RBAC) in Snowflake.
- Snowflake certifications are a plus.
Soft Skills :
- Strong analytical and problem-solving capabilities.
- Ability to work independently and in a collaborative team environment.
- Excellent communication and documentation skills.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1531492