Posted on: 20/11/2025
Description :
Role : Snowflake Data Architect
Experience : 9+ years
Location : Bangalore/Coimbatore
Shift : 2 PM-11 PM / 4 PM-12 AM IST
Work mode : Hybrid (3 days a week from office)
Notice : 30 days or less
About the Role :
We are seeking a highly skilled Snowflake Data Architect with expertise in DBT and strong experience in Kimball dimensional modeling. The ideal candidate will design and implement scalable data pipelines, integrate diverse data sources, and build a robust data warehouse that supports business intelligence and analytics initiatives.
Key Responsibilities :
Data Integration & Extraction :
- Develop and maintain ETL/ELT pipelines that load data into Snowflake.
- Extract data from multiple sources including APIs, direct database connections, and flat files (a minimal flat-file loading sketch follows this list).
- Ensure data quality, consistency, and reliability across all ingestion processes.
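As a minimal sketch of the flat-file ingestion path, assuming a hypothetical external stage and staging table (all object names below are placeholders, not part of this posting):

    COPY INTO raw.sales.stg_orders
    FROM @raw.sales.orders_stage
    PATTERN = '.*orders.*[.]csv'
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    ON_ERROR = 'ABORT_STATEMENT';  -- fail fast so a bad file does not load partially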
Data Modeling & Warehousing :
- Design and implement Kimball-style dimensional models in Snowflake.
- Build and optimize fact tables and dimension tables to support analytical workloads (a minimal star-schema sketch follows this list).
- Collaborate with business teams to define and maintain the Kimball bus matrix for subject areas.
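For illustration only, a minimal sketch of the kind of star-schema objects this implies, using hypothetical table and column names (a customer dimension with a surrogate key and an orders fact keyed on it):

    CREATE TABLE dim_customer (
        customer_sk    NUMBER AUTOINCREMENT PRIMARY KEY,  -- surrogate key; constraint is informational in Snowflake
        customer_id    VARCHAR,                           -- natural/business key from the source system
        customer_name  VARCHAR,
        region         VARCHAR,
        effective_from DATE,                              -- SCD Type 2 history window
        effective_to   DATE
    );

    CREATE TABLE fact_orders (
        order_date_key NUMBER,                            -- references dim_date
        customer_sk    NUMBER,                            -- references dim_customer
        order_amount   NUMBER(12,2),                      -- additive measure
        order_count    NUMBER DEFAULT 1
    );

Surrogate keys on dimensions keep the fact table narrow and insulate it from changes in source-system identifiers, which is the usual Kimball design choice.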
Transformation & Orchestration :
- Use DBT to develop modular, testable, and version-controlled transformations (a minimal model sketch follows this list).
- Implement data quality checks and documentation within DBT workflows.
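A minimal sketch of such a DBT model, assuming a hypothetical upstream staging model named stg_orders; not_null/unique tests and column documentation would be declared in the accompanying YAML properties file:

    -- models/marts/fct_orders.sql (hypothetical model)
    {{ config(materialized='incremental', unique_key='order_id') }}

    select
        o.order_id,
        o.customer_id,
        o.order_date,
        o.amount
    from {{ ref('stg_orders') }} o
    {% if is_incremental() %}
      -- on incremental runs, only process rows newer than what is already loaded
      where o.order_date > (select max(order_date) from {{ this }})
    {% endif %}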
Collaboration & Governance :
- Work closely with business stakeholders to understand requirements and translate them into technical solutions.
- Ensure compliance with data governance, security, and privacy standards.
Required Skills & Qualifications :
Technical Expertise :
- Strong proficiency in Snowflake architecture and performance tuning.
- Hands-on experience with DBT, Airbyte, and Airflow.
- Solid understanding of Kimball methodology for data warehousing.
Programming & Querying :
- Advanced SQL skills and familiarity with Python for ETL scripting.
- Experience integrating data from APIs and relational databases.
Soft Skills :
- Excellent communication and collaboration skills.
- Ability to work in an agile environment and manage multiple priorities.
Preferred Qualifications :
- Experience with cloud platforms (AWS, Azure, or GCP).
- Familiarity with BI tools (e.g., Tableau, Power BI).
- Knowledge of data governance and security best practices.
Posted in : Data Engineering
Functional Area : Data Engineering
Job Code : 1578134