Posted on: 09/08/2025
About the Role:
We are seeking an experienced and passionate Snowflake Developer to join our data engineering team. The ideal candidate will have a solid background in data integration, transformation, and cloud data warehousing using Snowflake, along with hands-on experience in Python, SQL, and various data ingestion tools.
As a Snowflake Developer, you will be responsible for developing scalable data pipelines, optimizing data workflows, and supporting various business and analytics teams with data solutions. A strong focus on troubleshooting, automation, and performance tuning is essential.
Key Responsibilities:
- Design, develop, and optimize data pipelines and ETL/ELT workflows using Snowflake.
- Create complex SQL queries, views, and stored procedures to support reporting and analytics needs.
- Develop and maintain Python scripts for data processing, automation, and custom integrations.
- Work with cloud storage and file transfer systems such as AWS S3 and SFTP, and integrate them into Snowflake pipelines.
- Manage data ingestion from structured and semi-structured sources into Snowflake.
- Implement secure data transfer mechanisms and maintain data lineage and metadata documentation.
- Proactively monitor pipelines, resolve data issues, and troubleshoot system errors, tracking work in JIRA and documenting resolutions in Confluence.
- Collaborate with cross-functional teams to understand business requirements and resolve data issues efficiently.
- Perform configuration and connectivity issue resolution for Snowflake and integrated tools.
- Work with JIRA for ticket tracking, Confluence for documentation, and Excel for ad-hoc data manipulations.
- Use IDP (Intelligent Data Processing) or Automic (if experienced) for workflow automation and scheduling.
Required Skills & Qualifications:
- 3+ years of hands-on experience as a Snowflake Developer.
- Strong command of SQL, Python, and Snowflake development.
- Proven experience with stored procedures, UDFs, and data transformation logic in Snowflake.
- Experience with cloud platforms (AWS preferred) and tools such as S3 and SFTP.
- Familiarity with data pipeline orchestration, job scheduling, and automation frameworks.
- Good understanding of data preparation, data quality, and data governance principles.
Posted in: Data Engineering
Functional Area: Big Data / Data Warehousing / ETL
Job Code: 1526932