Posted on: 26/11/2025
Description :
Position Overview :
USEReady is seeking an experienced Data Architect with deep expertise in Snowflake, Azure, and modern data engineering frameworks.
In this role, you will architect scalable data platforms, lead cloud data modernization initiatives, and guide teams in implementing best practices across data engineering, analytics engineering, and governance.
Key Responsibilities :
Cloud Data Architecture (Azure) :
- Design and implement scalable, enterprise-grade data architectures on Microsoft Azure.
- Integrate services such as Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, Azure Functions, and ADLS Gen2.
- Develop secure, efficient, and resilient cloud data ecosystems.
Snowflake Platform Leadership :
- Architect, deploy, and govern the Snowflake Data Cloud.
- Design data models (Star Schema, Data Vault), configure RBAC, optimize virtual warehouses, and manage Snowflake performance & cost.
- Implement Snowflake features like Snowpipe, Streams, Tasks, and data sharing.
Modern Data Transformation (dbt) :
- Lead strategy and implementation of transformation layers using dbt (Core/Cloud).
- Define standards for dbt project structure, documentation, testing, and CI/CD integration.
Python for Data Engineering :
- Build complex ingestion pipelines, automation scripts, API integrations, and data quality frameworks in Python.
- Support analytics & ML workflows (e.g., Snowpark for Python).
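As a minimal sketch of the kind of Python data quality framework this role involves (the function and column names here are purely illustrative, not part of the posting):

```python
# Illustrative row-level data quality checks of the sort a Python-based
# data quality framework might include. All names are hypothetical.

def check_not_null(rows, column):
    """Return the rows where `column` is missing or None."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return the values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return dupes

# Example: validating a small batch before loading it downstream.
batch = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 2, "email": "b@example.com"},
]
null_violations = check_not_null(batch, "email")
duplicate_ids = check_unique(batch, "id")
```

In practice, checks like these would run inside an ingestion pipeline (e.g., an ADF or Snowpark job) and route failing rows to quarantine rather than loading them.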
Data Governance & Security :
- Define and enforce governance frameworks, security protocols, and data quality controls.
- Ensure compliance with GDPR, CCPA, and internal enterprise policies.
Technical Leadership & Collaboration :
- Mentor data engineers and developers; lead architecture reviews.
- Collaborate with business teams, analysts, and product managers to deliver actionable data solutions.
- Drive innovation by evaluating emerging tools and technologies.
Required Skills :
- Snowflake (architecture, modeling, optimization, RBAC)
- Azure (ADF, ADLS Gen2, Synapse, networking, security)
- Azure Databricks
- dbt (Core/Cloud)
- CI/CD (Azure DevOps, GitHub Actions)
- Python (Pandas, PySpark, SQLAlchemy)
- Data Modeling (Kimball, Inmon, Data Vault)
- SQL expertise
- Data governance & security
- 10+ years in data engineering/architecture (3+ years as Data Architect)
Qualifications :
- Bachelor's degree in Engineering, Computer Science, or related field
- Certifications in Snowflake, Azure, dbt, or cloud technologies (preferred)
Technical Requirements :
- Snowflake, Azure, ADF, Databricks, dbt, Python, CI/CD
Posted in: Data Engineering
Functional Area: Technical / Solution Architect
Job Code: 1581331