Posted on: 18/11/2025
Description :
Title : Snowflake + dbt Data Architect.
Location : Kochi.
Experience : 7+ years in data engineering/architecture; 3+ years hands-on with Snowflake & dbt.
Role Summary :
You'll own the technical design and delivery of a HIPAA-sensitive Snowflake + dbt data platform for a healthcare enterprise.
You'll translate business needs into a secure, cost-efficient, maintainable data architecture and lead two developers to implement it.
Key Responsibilities :
- Lead end-to-end design of the Snowflake data platform : schema design, warehouses, resource monitors, roles, and sharing strategies.
- Define and enforce dbt architecture & conventions (models, macros, seeds, tests, CI/CD).
- Design secure ingestion patterns (Snowpipe, staged files, Fivetran/Matillion connectors) and policies for PHI handling (masking, encryption, RBAC); see the masking sketch after this list.
- Set up governance : data cataloging strategy, lineage, auditing, and cost monitoring.
- Define the performance and cost optimization plan (clustering keys, micro-partition pruning, warehouse scaling policies).
- Create CI/CD pipelines for dbt + Snowflake deployments (Git, GitHub Actions / Azure DevOps / Bitbucket pipelines).
- Mentor developers, perform design/code reviews, and lead stand-ups with stakeholders (security, compliance, analytics).
- Produce runbooks, operational docs, and incident procedures.
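To make the PHI-handling responsibility above concrete, here is a minimal sketch of applying a dynamic masking policy and a role-based grant from Python via the Snowflake connector. The database, schema, table, column, and role names (phi.clinical.patients, ssn, PHI_ANALYST, REPORTING_ANALYST) and the connection parameters are hypothetical placeholders, not part of this job description.

```python
# Minimal sketch: apply a PHI masking policy and role-based access in Snowflake.
# All object names, roles, and connection details below are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",         # placeholder account identifier
    user="data_architect",        # placeholder user
    password="***",               # use key-pair auth or SSO in practice
    role="GOVERNANCE_ADMIN",      # placeholder role assumed to hold policy/grant privileges
    warehouse="ADMIN_WH",
)

statements = [
    # Column-level dynamic masking: only PHI-cleared roles see the raw value.
    """
    CREATE MASKING POLICY IF NOT EXISTS phi.policies.mask_ssn
      AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PHI_ANALYST') THEN val
           ELSE '***MASKED***' END
    """,
    # Attach the policy to the sensitive column.
    """
    ALTER TABLE phi.clinical.patients
      MODIFY COLUMN ssn SET MASKING POLICY phi.policies.mask_ssn
    """,
    # RBAC: reporting analysts get read access; the masking policy above still
    # hides raw SSNs from them because their role is not in the allow-list.
    "GRANT SELECT ON ALL TABLES IN SCHEMA phi.clinical TO ROLE REPORTING_ANALYST",
]

cur = conn.cursor()
try:
    for sql in statements:
        cur.execute(sql)
finally:
    cur.close()
    conn.close()
```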
Required Skills & Experience :
- 7+ years data engineering/architecture experience.
- 3+ years production experience with Snowflake (warehouses, Snowpipe, Tasks, Streams, Time Travel, cloning).
- 2+ years building production data transformations with dbt (models, tests, snapshots, macros).
- Expert-level SQL; able to write complex analytic queries and optimize them for Snowflake.
- Experience with healthcare data standards (HL7/FHIR/claims) and HIPAA compliance controls.
- Experience with at least one cloud provider on which Snowflake runs (AWS/Azure/GCP).
- Familiarity with ETL/ELT tools (Fivetran/Matillion/Airbyte) and orchestration (Airflow, Prefect).
- Experience implementing RBAC, data masking, and access auditing.
Nice to have :
- Experience with data catalog tools (Collibra, Alation) or open source alternatives.
- Familiarity with dbt Cloud and/or advanced dbt packages.
- Experience with cost governance tools or usage-based optimization (e.g., Snowflake Resource Monitors); see the sketch after this list.
- Prior role in healthcare or regulated enterprise.
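As a companion to the cost-governance points above, a minimal sketch of creating a Snowflake Resource Monitor and attaching it to a warehouse from Python. The monitor name, credit quota, warehouse name, and connection details are hypothetical; resource monitors are normally managed by ACCOUNTADMIN or a role granted the relevant privilege.

```python
# Minimal sketch: cap monthly credit usage with a resource monitor.
# Names, quota, and connection details are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="data_architect",
    password="***",               # use key-pair auth or SSO in practice
    role="ACCOUNTADMIN",          # resource monitors are typically account-admin managed
)

statements = [
    # Monthly monitor: notify at 80% of the quota, suspend warehouses at 100%.
    """
    CREATE OR REPLACE RESOURCE MONITOR transform_rm
      WITH CREDIT_QUOTA = 100
           FREQUENCY = MONTHLY
           START_TIMESTAMP = IMMEDIATELY
      TRIGGERS ON 80 PERCENT DO NOTIFY
               ON 100 PERCENT DO SUSPEND
    """,
    # Attach the monitor to the dbt transformation warehouse.
    "ALTER WAREHOUSE transform_wh SET RESOURCE_MONITOR = transform_rm",
]

cur = conn.cursor()
try:
    for sql in statements:
        cur.execute(sql)
finally:
    cur.close()
    conn.close()
```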
Posted in : Data Engineering
Functional Area : Data Engineering
Job Code : 1576789