Posted on: 10/10/2025
Description:
Key Responsibilities:
- Design, develop, and maintain scalable data solutions using Snowflake.
- Implement and optimize ETL/ELT pipelines for large-scale data processing.
- Collaborate with data architects and analysts to define data models and warehouse structures.
- Ensure data quality, integrity, and security across all stages of the data lifecycle.
- Integrate Snowflake with various cloud platforms (e.g., AWS, Azure, GCP).
- Monitor performance and troubleshoot issues in Snowflake environments.
- Document technical specifications and maintain best practices.
Required Skills & Qualifications:
- 4+ years of experience in data engineering or data warehousing.
- Strong expertise in Snowflake, including performance tuning and advanced features.
- Proficiency in SQL for data manipulation, transformation, and analysis.
- Solid understanding of data warehouse architecture and dimensional modeling.
- Experience with cloud platforms (AWS, Azure, or GCP).
- Familiarity with tools like dbt, Airflow, Informatica, or Talend is a plus.
- Excellent problem-solving and communication skills.
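To give a flavor of the ELT pattern named in the responsibilities above, here is a minimal, self-contained Python sketch. It is a toy illustration only: there is no Snowflake connection, and all table fields, values, and function names are invented. In a real ELT workflow the "load" step lands raw data in the warehouse and the "transform" step runs afterwards (typically in SQL or dbt); the sketch mirrors that ordering with plain Python data structures.

```python
# Toy ELT sketch (hypothetical data and names, for illustration only).
# Raw records land unmodified first; cleanup happens after loading,
# mirroring the extract-load-transform ordering described above.

raw_rows = [
    {"order_id": 1, "amount": "19.99", "region": "eu"},
    {"order_id": 2, "amount": "5.00", "region": "us"},
    {"order_id": 2, "amount": "5.00", "region": "us"},  # duplicate row
]

def load(rows):
    # "EL" step: stage the raw rows exactly as extracted.
    return list(rows)

def transform(staged):
    # "T" step: deduplicate on order_id and normalize types post-load.
    seen, out = set(), []
    for row in staged:
        if row["order_id"] in seen:
            continue
        seen.add(row["order_id"])
        out.append({
            **row,
            "amount": float(row["amount"]),   # cast string to number
            "region": row["region"].upper(),  # normalize casing
        })
    return out

clean = transform(load(raw_rows))
print(clean)
```

In a Snowflake setting the same separation of concerns would typically appear as a `COPY INTO` load into a staging table followed by SQL transformations into modeled tables; the Python above only demonstrates the ordering of steps.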
Posted in
Data Engineering
Functional Area
Data Engineering
Job Code
1558095