Posted on: 09/12/2025
Description:
About the Role:
We are seeking a highly skilled Senior Data Architect with extensive experience in designing, implementing, and optimizing enterprise-grade data platforms. The ideal candidate will be responsible for defining end-to-end data architecture, enabling scalable data pipelines, modernizing legacy systems, and implementing robust data governance and DataOps practices. This role requires deep hands-on technical expertise in Snowflake, ELT/ETL engineering, data modeling, and cloud-based data ecosystems.
You will collaborate closely with engineering, analytics, cloud, and product teams to architect and deliver secure, high-performance data solutions that support mission-critical business operations.
Core Responsibilities:
Data Architecture & Solution Design:
- Architect and evolve enterprise data platforms leveraging Snowflake, cloud-native storage, and distributed data processing frameworks.
- Define reference architecture, data ingestion frameworks, canonical models, and standards for scalable data pipelines.
- Lead design of multi-zone data lake and data warehouse architectures supporting batch, streaming, and CDC pipelines.
- Develop high-performance schemas, including star/snowflake models, data vault, and domain-driven designs.
- Drive architectural decisions related to partitioning, clustering, micro-partition optimization, query tuning, and cost governance within Snowflake.
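For illustration, a minimal sketch of the kind of clustering and pruning check this responsibility implies, using the snowflake-connector-python driver. The connection parameters, the ORDERS table, and the cluster key columns are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch: define a clustering key and inspect clustering depth in Snowflake.
# Requires snowflake-connector-python; credentials, table, and key columns below
# are hypothetical placeholders.
import json
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",       # hypothetical account identifier
    user="my_user",             # hypothetical credentials
    password="my_password",
    warehouse="ANALYTICS_WH",
    database="SALES",
    schema="PUBLIC",
)
cur = conn.cursor()
try:
    # Cluster a large fact table on the columns most queries filter by,
    # so micro-partition pruning keeps scans (and compute cost) down.
    cur.execute("ALTER TABLE ORDERS CLUSTER BY (ORDER_DATE, REGION)")

    # SYSTEM$CLUSTERING_INFORMATION returns JSON; a high average depth suggests
    # poor pruning and a candidate for a different key or re-clustering.
    cur.execute(
        "SELECT SYSTEM$CLUSTERING_INFORMATION('ORDERS', '(ORDER_DATE, REGION)')"
    )
    info = json.loads(cur.fetchone()[0])
    print("average clustering depth:", info["average_depth"])
finally:
    cur.close()
    conn.close()
```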
Data Engineering & Pipeline Development:
- Oversee and guide technical teams in building complex ETL/ELT pipelines using Python, Java, or Scala.
- Lead modernization of legacy systems and architect large-scale data migration solutions.
- Define ingestion patterns (real-time, near real-time, batch) and implement scalable, reusable data pipelines (see the incremental-load sketch after this list).
- Ensure high-quality code delivery aligned with CI/CD pipelines, version control, and automated testing frameworks.
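For illustration, a minimal watermark-based incremental load in plain Python, the pattern behind many of the batch and near-real-time pipelines described above. The source rows, target store, and timestamps are in-memory stand-ins for real connectors; all names are hypothetical.

```python
# Minimal sketch of an incremental (watermark-based) batch load.
# Source rows, the target store, and the persisted watermark are
# in-memory stand-ins for real source/target connectors.
from datetime import datetime, timezone

# Hypothetical source rows carrying a last-modified timestamp
# (e.g. from a CDC feed or a query-based extract).
SOURCE = [
    {"id": 1, "amount": 120.0, "updated_at": datetime(2025, 9, 1, tzinfo=timezone.utc)},
    {"id": 2, "amount": 75.5,  "updated_at": datetime(2025, 9, 3, tzinfo=timezone.utc)},
    {"id": 3, "amount": 42.0,  "updated_at": datetime(2025, 9, 5, tzinfo=timezone.utc)},
]

def run_incremental_load(watermark: datetime, target: dict) -> datetime:
    """Load only rows changed since the last successful run, then advance the watermark."""
    changed = [row for row in SOURCE if row["updated_at"] > watermark]
    for row in changed:
        target[row["id"]] = row  # idempotent upsert keyed on the business id
    return max((row["updated_at"] for row in changed), default=watermark)

target_table: dict = {}
watermark = datetime(2025, 9, 2, tzinfo=timezone.utc)  # persisted from the previous run
watermark = run_incremental_load(watermark, target_table)
print(f"loaded {len(target_table)} rows, new watermark: {watermark.isoformat()}")
```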
DBT & Data Transformation Frameworks:
- Implement and scale DBT for transformation orchestration, data quality tests, documentation, lineage tracking, and semantic modeling (a brief CI-style sketch follows this list).
- Establish best practices around modularity, model reuse, macros, materializations, and environment management in DBT.
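For illustration, a minimal sketch of invoking DBT runs, tests, and documentation generation from a Python CI step. It assumes dbt-core is installed and that profiles.yml defines a "ci" target; the project directory name is a hypothetical placeholder.

```python
# Minimal sketch: run dbt models, data tests, and docs generation as a CI step.
# Assumes dbt-core is installed and profiles.yml defines a "ci" target;
# the project directory is a hypothetical placeholder.
import subprocess
import sys

PROJECT_DIR = "analytics_dbt"  # hypothetical dbt project path

def dbt(*args: str) -> None:
    """Invoke the dbt CLI and fail the pipeline on a non-zero exit code."""
    cmd = ["dbt", *args, "--project-dir", PROJECT_DIR, "--target", "ci"]
    completed = subprocess.run(cmd)
    if completed.returncode != 0:
        sys.exit(completed.returncode)

dbt("run")               # build models for the ci environment
dbt("test")              # run schema and data quality tests
dbt("docs", "generate")  # produce docs and lineage artifacts for the catalog
```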
Data Quality, Governance, and Security:
- Define and enforce standards for metadata management, data cataloging, lineage, and data lifecycle policies.
- Implement data governance frameworks including access control, encryption, masking, and compliance requirements.
- Collaborate with InfoSec teams to design secure data zones, identity access management, and audit mechanisms across cloud platforms.
Cloud & Platform Engineering:
- Drive cloud-native data architecture initiatives using Snowflake, AWS/Azure/GCP services, and containerized workloads.
- Optimize platform costs, storage strategies, and compute utilization across cloud environments.
- Integrate Snowflake with enterprise data platforms, APIs, Kafka streams, SAP systems, and downstream analytics tools.
Stakeholder Management & Technical Leadership:
- Act as the primary technical architect for large data programs and ensure architectural integrity across all phases.
- Guide engineering teams through code reviews, solution reviews, and architecture compliance checks.
- Collaborate with business, data science, and BI teams to shape data solutions supporting analytics and decision-making.
Required Skills & Qualifications:
- 10+ years of experience in data engineering, data architecture, and enterprise data platform development.
- Deep hands-on expertise in Snowflake architecture, query engineering, performance tuning, security models, and operational best practices.
- Strong experience with ETL/ELT frameworks, large data migrations, and high-volume data processing systems.
- Proficiency in Python / Java / Scala, advanced SQL, and data integration techniques.
- Expertise in data modeling, dimensional modeling, schema design, and optimization strategies.
- Strong understanding of cloud data warehousing, distributed computing, data lakes, and cloud-native storage.
- Experience implementing and managing DBT in large-scale environments.
- Strong understanding of data governance, MDM, metadata management, and DataOps practices.
Posted in: Data Engineering
Functional Area: Technical / Solution Architect
Job Code: 1587264