hirist

Job Description

Description :

Position overview

We are looking for a seasoned Data Architect with deep expertise in cloud-native data platforms on AWS and in Snowflake.

The Data Architect will be responsible for designing and implementing enterprise-grade data platforms on the AWS cloud using Snowflake, Python, PySpark, and other cloud-native tools and services. The role demands strong hands-on technical expertise in modern data engineering frameworks, data modeling, and architecture best practices, along with a solid understanding of data management, governance, and security principles.

This role will collaborate with cross-functional teams to establish scalable, high-performing, and secure data ecosystems that enable advanced analytics, AI/ML, and BI use cases.

Key Responsibilities :

- Architect, design, and implement data lakehouse and data warehouse solutions on AWS and Snowflake using the Medallion Architecture (Bronze/Silver/Gold layers).

- Define and implement end-to-end data pipelines and orchestration layers/frameworks.

- Design for multi-structured data (structured, semi-structured, unstructured).

- Define and develop architecture patterns for streaming, batch, and real-time data processing.

- Integrate AWS data services (S3, Lake Formation, Kinesis, Lambda) with Snowflake into enterprise solutions.

- Implement data quality, cataloging, lineage, and metadata management frameworks.

- Partner with the data governance team to enforce standards for data ownership, stewardship, and lifecycle management.

- Define policies for data security, masking, and access control aligned with organizational governance.

- Drive adoption of best practices in DataOps, DevOps, and CI/CD for data engineering.

- Contribute to data strategy and roadmap creation aligned with enterprise objectives.

- Partner with business stakeholders (trading, risk, compliance) to translate requirements into technical architecture.

- Provide technical leadership and guidance to engineering teams.
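The Medallion Architecture named in the first responsibility can be sketched in a few lines. This is a minimal, library-free illustration of the Bronze/Silver/Gold layering only: in practice each layer would be a Snowflake table or S3 dataset processed with PySpark, and the field names (`trade_id`, `amount`) are hypothetical, chosen to echo the trading domain mentioned below.

```python
# Minimal sketch of a Medallion-style (Bronze/Silver/Gold) flow.
# Plain Python dicts stand in for what would be Snowflake tables or
# S3 files processed with PySpark; all field names are illustrative.

def bronze_ingest(raw_records):
    """Bronze: land raw records as-is, tagging each with a source marker."""
    return [{**r, "_source": "api"} for r in raw_records]

def silver_clean(bronze):
    """Silver: drop malformed rows and normalize types."""
    cleaned = []
    for r in bronze:
        if r.get("amount") is None:
            continue  # discard rows missing a required field
        cleaned.append({"trade_id": r["trade_id"], "amount": float(r["amount"])})
    return cleaned

def gold_aggregate(silver):
    """Gold: business-level aggregate ready for BI consumption."""
    return {"total_amount": sum(r["amount"] for r in silver),
            "trade_count": len(silver)}

raw = [{"trade_id": 1, "amount": "100.5"},
       {"trade_id": 2, "amount": None},
       {"trade_id": 3, "amount": "49.5"}]
gold = gold_aggregate(silver_clean(bronze_ingest(raw)))
print(gold)  # {'total_amount': 150.0, 'trade_count': 2}
```

The design choice the pattern encodes: raw data is preserved untouched (Bronze), cleansing rules live in one place (Silver), and consumers only ever read curated aggregates (Gold).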

Required Skills & Experience :

- 15-18 years of experience in Data Engineering / Architecture, with at least 5 years in cloud-native data platforms (AWS) and Snowflake.

- Strong expertise in AWS and Snowflake services (S3, Glue, Lambda, Step Functions, IAM, CloudWatch, Snowpipe, etc.).

- Expert-level proficiency in Snowflake schema design, performance tuning, ELT, security setup, and integration.

- Strong programming skills in Python and PySpark for data transformation and automation.

- Experience with ETL/ELT frameworks (Informatica, Matillion, dbt, Glue).

- Good understanding of data modeling, metadata management, data lineage, and master data concepts.

- Good exposure to data governance frameworks (Collibra, Alation, or custom).

- Exposure to streaming technologies (Kafka, Kinesis) and API-based data integrations.

- Strong understanding of security, compliance, and privacy frameworks (GDPR, HIPAA, etc.).

- Strong problem-solving and analytical mindset.

- Excellent communication and stakeholder management skills.

- Ability to lead and mentor data engineering teams.

- Self-driven, proactive, and capable of implementing technical initiatives end-to-end.

Preferred Qualifications :

- AWS Certified Data Analytics Specialty / AWS Solutions Architect certification.

- SnowPro Core / SnowPro Advanced Architect Certification.

- Experience working in multi-cloud or hybrid environments.

- Exposure to Databricks or Azure Synapse is considered a plus.

Education : Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.

