
Snowflake Architect - Data Warehousing Solutions

Compunnel Technology India Private Limited
7 - 10 Years
Noida

Posted on: 27/02/2026

Job Description

We are looking for a highly skilled and experienced Snowflake Architect to design, develop, and optimize scalable cloud-based data warehouse solutions.

The ideal candidate will have strong expertise in Snowflake architecture, cloud platforms (AWS/Azure/GCP), data modeling, and modern data engineering practices.

The role involves leading data architecture initiatives, defining best practices, and collaborating with cross-functional teams to build high-performance, secure, and cost-efficient data platforms.

Key Responsibilities :


- Design and architect scalable, high-performance data warehouse solutions using Snowflake.

- Lead the end-to-end implementation of Snowflake, including setup, configuration, and optimization.

- Develop enterprise-level data models (dimensional, star/snowflake schemas).

- Design and implement ETL/ELT pipelines for structured and semi-structured data.

- Integrate Snowflake with cloud platforms such as AWS, Azure, or GCP.

- Optimize performance through clustering, partitioning, query tuning, and warehouse sizing.

- Implement data security, governance, and role-based access control (RBAC).

- Lead data migration projects from legacy systems (Oracle, SQL Server, Teradata, etc.) to Snowflake.

- Work closely with BI teams to enable reporting and analytics.

- Define best practices for CI/CD, DevOps, and automation in data environments.

- Mentor junior data engineers and provide architectural guidance.

- Collaborate with stakeholders to understand business requirements and translate them into scalable data solutions.

Technical Skills Required :


- Strong expertise in Snowflake architecture and administration.

- Hands-on experience with SQL (advanced query writing and optimization).

- Experience with cloud platforms (AWS / Azure / GCP).

- Knowledge of data integration tools (Informatica, Talend, Matillion, Fivetran, etc.).

- Experience with ETL/ELT frameworks and data pipeline development.

- Understanding of data modeling concepts and best practices.

- Experience handling structured and semi-structured data (JSON, Parquet, Avro).

- Familiarity with scripting languages such as Python or Shell.

- Experience with version control and CI/CD tools (Git, Jenkins, Terraform, etc.).

Good to Have :


- Snowflake certification (SnowPro Core / Advanced).

- Experience building Data Lake and Lakehouse architectures.

- Knowledge of BI tools (Power BI, Tableau, Looker).

- Exposure to data governance frameworks and compliance standards.

Qualifications :


- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

- 7 to 10 years of overall experience in Data Engineering / Data Architecture roles.

- At least 3 years of hands-on experience with Snowflake implementations.

Key Competencies :


- Strong analytical and problem-solving skills.

- Excellent communication and stakeholder management abilities.

- Ability to work in a fast-paced, agile environment.

- Leadership skills with experience mentoring teams.

