Posted on: 12/02/2026
Description:
We are seeking a visionary Solution Architect to design, lead, and evolve our modern data platform. You will be the primary architect responsible for leveraging Snowflake and dbt to build a scalable, high-performance data ecosystem. This role is at the intersection of architecture, analytics engineering, and data science, requiring a leader who can translate complex business goals into robust technical frameworks while ensuring operational excellence across the entire data lifecycle.
Roles and Responsibilities:
1. Strategic Data Architecture:
- Platform Design: Architect end-to-end data solutions on Snowflake, focusing on multi-cluster warehousing, advanced security configurations (RBAC), and cost-efficient scaling.
- ELT Strategy: Lead the transition to modern ELT patterns, ensuring data flows efficiently from source to consumption layers.
- Data Modeling: Define the core data architecture, using industry-standard modeling techniques (Star Schema, Snowflake Schema) to support diverse analytics needs.
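To illustrate the star-schema modeling pattern named above, a minimal sketch of a fact table joined to conformed dimensions (all table and column names here are hypothetical):

```sql
-- Hypothetical star schema: one fact table, three conformed dimensions.
-- Grain of fact_orders: one row per order line.
SELECT
    d.calendar_date,
    c.customer_segment,
    p.product_category,
    SUM(f.net_amount) AS revenue,
    COUNT(*)          AS order_lines
FROM fact_orders  f
JOIN dim_date     d ON f.date_key     = d.date_key
JOIN dim_customer c ON f.customer_key = c.customer_key
JOIN dim_product  p ON f.product_key  = p.product_key
GROUP BY 1, 2, 3;
```

Narrow surrogate keys keep the fact table compact, and filtering on the date dimension lets Snowflake prune micro-partitions rather than scan the full fact table.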
2. Analytics Engineering & Governance:
- dbt Leadership: Own the dbt environment, defining standards for models, macros, snapshots, and documentation to ensure a "code-first" approach to data transformation.
- Quality Frameworks: Implement automated data quality testing and observability frameworks within dbt to ensure trust in downstream reporting.
- DevOps for Data: Oversee CI/CD pipelines for the data stack, ensuring seamless deployments, version control (Git), and rigorous code review processes.
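As a concrete example of the automated quality testing mentioned above, dbt supports "singular tests": SQL files under `tests/` that pass when the query returns zero rows. A minimal sketch (the model and column names are hypothetical):

```sql
-- tests/assert_no_negative_order_amounts.sql
-- dbt singular test: passes only if this query returns zero rows.
-- Model name (stg_orders) and columns are hypothetical.
SELECT
    order_id,
    net_amount
FROM {{ ref('stg_orders') }}
WHERE net_amount < 0
```

Running `dbt test` executes this alongside the built-in generic tests (`not_null`, `unique`, etc.) declared in the project's YAML files.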
3. Machine Learning & Advanced Analytics:
- ML Integration: Partner with Data Scientists to design and deploy ML solutions that leverage Snowpark and Snowpark ML, bringing compute directly to the data.
- Feature Engineering: Architect scalable feature stores and data pipelines specifically optimized for ML model training and inference within Snowflake.
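A simple sketch of the in-warehouse feature engineering described above: materializing aggregate features as a table that training and inference jobs can both read (schema, table, and column names are hypothetical):

```sql
-- Hypothetical feature table, refreshed on a schedule ahead of training runs.
CREATE OR REPLACE TABLE ml.customer_features AS
SELECT
    customer_id,
    COUNT(*)                                        AS orders_90d,
    SUM(net_amount)                                 AS spend_90d,
    DATEDIFF('day', MAX(order_date), CURRENT_DATE)  AS days_since_last_order
FROM analytics.fact_orders
WHERE order_date >= DATEADD('day', -90, CURRENT_DATE)
GROUP BY customer_id;
```

Computing features where the data lives avoids copying raw order history out to a separate ML environment, which is the main argument for Snowpark-style "compute to the data" designs.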
4. Collaboration & Mentorship:
- Stakeholder Alignment: Act as the technical liaison between data engineers, business analysts, and executive stakeholders to ensure the platform meets long-term business objectives.
- Best Practices: Establish and evangelize best practices for SQL development, performance tuning, and documentation across the data organization.
Technical Requirements:
Must-Have Skills:
- Snowflake Mastery: Extensive hands-on experience with Snowflake architecture, including performance tuning (clustering, search optimization), security (masking policies, row-level security), and cost governance.
- dbt Proficiency: Advanced experience with dbt (Core or Cloud), including complex macros, materialization strategies, and test suites.
- Cloud Architecture: Proven track record of designing data solutions in major cloud environments (AWS, Azure, or GCP).
- SQL & Modeling: Expert-level SQL skills and a deep understanding of ELT/ETL best practices.
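For the masking policies listed among the security skills, a minimal Snowflake sketch: a dynamic masking policy that reveals a PII column only to an authorized role (the schema, role, and column names are hypothetical):

```sql
-- Hypothetical dynamic data masking policy for an email column.
CREATE OR REPLACE MASKING POLICY pii.email_mask AS (val STRING)
RETURNS STRING ->
    CASE
        WHEN CURRENT_ROLE() IN ('PII_ANALYST') THEN val
        ELSE '***MASKED***'
    END;

-- Attach the policy to the column it protects.
ALTER TABLE analytics.dim_customer
    MODIFY COLUMN email SET MASKING POLICY pii.email_mask;
```

Once attached, the policy is enforced at query time for every role, so downstream tools need no masking logic of their own.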
Nice-to-Have (Preferred):
- Data Vault 2.0: Experience implementing Data Vault 2.0 methodologies for agile, scalable data warehousing.
- Snowpark: Hands-on experience with Snowpark (Python/Java/Scala) and external functions for advanced processing.
- Infrastructure as Code (IaC): Familiarity with tools like Terraform for managing data infrastructure.
Success Indicators:
- Efficiency : Significant reduction in data processing latency and cloud compute costs.
- Reliability : High uptime of data pipelines and a robust suite of passing data quality tests.
- Adoption : Successful deployment of ML models into production using Snowflake-native capabilities.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1612178