Posted on: 07/11/2025
Description:
What You'll Do:
- Design and implement high-quality ETL/ELT data pipelines in Snowflake using Python, R, or SQL scripting.
- Develop and maintain data models, schemas, and warehouse structures optimized for performance and scalability.
- Integrate Snowflake with various cloud and on-prem sources via APIs, connectors, and data ingestion frameworks.
- Collaborate with Data Scientists, Analysts, and BI teams to provide reliable, timely data for reporting and analytics.
- Manage and optimize warehouse compute resources, query performance, and cost efficiency.
- Implement robust data governance and quality frameworks, including security, privacy, and compliance standards.
- Support and maintain CI/CD pipelines for data workflows (GitHub).
- Troubleshoot complex data issues and recommend process improvements to enhance data reliability.
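The first responsibility above, ELT-style pipelines, can be sketched in plain Python. This is not part of the posting, just a minimal illustration of the load-then-transform pattern: raw rows are loaded first, and the transformation runs as SQL inside the database. SQLite stands in for Snowflake here so the sketch is self-contained; the table and column names are invented for the example.

```python
import sqlite3

def run_elt(conn, rows):
    """Minimal ELT sketch: load raw rows, then transform in SQL.

    `rows` are (id, amount, status) tuples. SQLite is a stand-in for a
    warehouse such as Snowflake; the schema is hypothetical.
    """
    cur = conn.cursor()
    # Load step: land the raw data untransformed.
    cur.execute(
        "CREATE TABLE IF NOT EXISTS raw_orders (id INTEGER, amount REAL, status TEXT)"
    )
    cur.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", rows)
    # Transform step: run inside the database, ELT style.
    cur.execute(
        """
        CREATE TABLE orders_summary AS
        SELECT status, COUNT(*) AS n, SUM(amount) AS total
        FROM raw_orders
        GROUP BY status
        """
    )
    conn.commit()
    return cur.execute(
        "SELECT status, n, total FROM orders_summary ORDER BY status"
    ).fetchall()
```

In a real Snowflake deployment the same pattern typically runs through a connector or an orchestration tool rather than a direct script, but the load-first, transform-in-warehouse shape is the same.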
What You Bring:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 5-7 years of experience in Data Engineering.
- 3+ years of hands-on experience in Snowflake design, development, and optimization.
- Strong expertise in SQL, data modelling (3NF, star schema, data vault), and performance tuning.
- Proficiency in Python, R, or SQL scripting for automation and data transformation.
- Experience with ETL/ELT orchestration tools (Airflow, dbt, or equivalent).
- Familiarity with cloud platforms like AWS, Azure, or GCP, and their native data ecosystems.
- Understanding of data governance, access control (RBAC), and encryption in Snowflake.
- Strong analytical mindset with the ability to translate business requirements into technical solutions.
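The data-modelling requirement above mentions star schemas. Not part of the posting, but as a toy illustration of the idea: a star schema splits flat records into dimension tables (descriptive attributes keyed by a surrogate key) and a fact table (measures referencing those keys). The function and field names below are invented for the example.

```python
def to_star_schema(flat_rows):
    """Toy star-schema split: flat sales records become one dimension
    table and one fact table (names are hypothetical)."""
    product_keys = {}  # product name -> surrogate key
    facts = []
    for row in flat_rows:
        name = row["product"]
        # Assign a surrogate key the first time a product is seen.
        if name not in product_keys:
            product_keys[name] = len(product_keys) + 1
        # Fact rows carry measures plus foreign keys into dimensions.
        facts.append({"product_key": product_keys[name], "qty": row["qty"]})
    dim_product = [
        {"product_key": key, "product": name}
        for name, key in product_keys.items()
    ]
    return dim_product, facts
```

Real warehouse models add more dimensions (date, customer, etc.) and handle slowly changing attributes, but the dimension/fact separation shown here is the core of the star shape.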
Functional Area
Data Engineering
Job Code
1571256