Posted on: 17/11/2025
Job Description:
Primary Responsibilities:
- Architect, design, and implement end-to-end data solutions using Snowflake.
- Lead data modeling, schema design, and performance optimization efforts.
- Develop scalable ETL/ELT pipelines using Python/Java/Scala, SQL, and modern data integration tools.
- Define and enforce data governance, quality frameworks, and DataOps practices.
- Work with cross-functional teams to translate business needs into technical architectures.
- Implement real-time streaming pipelines and Change Data Capture (CDC) frameworks.
- Oversee Snowflake best practices, security, cost optimization, and workload management.
- Collaborate with cloud teams on deployments across AWS/Azure/GCP.
Required Skills & Experience:
- 10+ years of experience in data architecture, data engineering, or similar roles.
- Strong hands-on expertise in Snowflake architecture, features, and advanced capabilities.
- Proven experience with dbt, data modeling, and designing scalable schemas.
- Deep understanding of cloud data warehousing concepts and Snowflake best practices.
- Strong programming skills in Python, Java, or Scala.
- Expertise in SQL, ETL pipelines, and data integration techniques.
- Familiarity with cloud platforms like AWS, Azure, or GCP.
- Experience implementing data governance and DataOps frameworks.
- Knowledge of real-time streaming tools and CDC mechanisms.
- Demonstrated ability to deliver large-scale, complex cloud-based data solutions.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1576403