hirist

Job Description

We are seeking an experienced Data Architect with expertise in Snowflake, dbt, Apache Airflow, and AWS to design, implement, and optimize scalable data solutions.

The ideal candidate will play a critical role in defining data architecture, governance, and best practices while collaborating with cross-functional teams to drive data-driven decision-making.

Key Responsibilities :

Data Architecture & Strategy :


- Design and implement scalable, high-performance cloud-based data architectures on AWS.

- Define data modeling standards for structured and semi-structured data in Snowflake.

- Establish data governance, security, and compliance best practices.

Data Warehousing & ETL/ELT Pipelines :

- Develop, maintain, and optimize Snowflake-based data warehouses.

- Implement dbt (data build tool) for data transformation and modeling.

- Design and schedule data pipelines using Apache Airflow for orchestration.

Cloud & Infrastructure Management :

- Architect and optimize data pipelines using AWS services like S3, Glue, Lambda, and Redshift.

- Ensure cost-effective, highly available, and scalable cloud data solutions.

Collaboration & Leadership :

- Work closely with data engineers, analysts, and business stakeholders to align data solutions with business goals.

- Provide technical guidance and mentoring to the data engineering team.

Performance Optimization & Monitoring :

- Optimize query performance and data processing within Snowflake.

- Implement logging, monitoring, and alerting for pipeline reliability.

Required Skills & Qualifications :

- 10+ years of experience in data architecture, engineering, or related roles.

- Strong expertise in Snowflake, including data modeling, performance tuning, and security best practices.

- Hands-on experience with dbt for data transformations and modeling.

- Proficiency in Apache Airflow for workflow orchestration.

- Strong knowledge of AWS services (S3, Glue, Lambda, Redshift, IAM, EC2, etc.).

- Experience with SQL, Python, or Spark for data processing.

- Familiarity with CI/CD pipelines and Infrastructure-as-Code (Terraform/CloudFormation) is a plus.

- Strong understanding of data governance, security, and compliance (GDPR, HIPAA, etc.).

Preferred Qualifications :

- Certifications : AWS Certified Data Analytics Specialty, Snowflake SnowPro Certification, or dbt Certification.

- Experience with streaming technologies (Kafka, Kinesis) is a plus.

- Knowledge of modern data stack tools (Looker, Power BI, etc.).

- Experience in the OTT streaming domain is an added advantage.



Functional Area

Data Engineering

Job Code

1592489