hirist

Snowflake Data Engineer - Python/ETL

EGISEDGE
Delhi
6 - 7 Years

Posted on: 21/08/2025

Job Description

Responsibilities :

- Design and build scalable data pipeline architecture that can handle large volumes of data

- Develop ELT/ETL pipelines to extract, load and transform data from various sources into our data warehouse

- Optimize and maintain the data infrastructure to ensure high availability and performance

- Collaborate with data scientists and analysts to identify and implement improvements to our data pipeline and models

- Develop and maintain data models to support business needs

- Ensure data security and compliance with data governance policies

- Identify and troubleshoot data quality issues

- Automate and streamline processes related to data management

- Stay up-to-date with emerging data technologies and trends to ensure the continuous improvement of our data infrastructure and architecture

- Analyze the data products and requirements to align with data strategy

- Assist cross-functional business partners (consumer insights, supply chain, and finance teams) in extracting and researching data

- Enhance the efficiency, automation, and accuracy of existing reports

- Follow best practices in data querying and manipulation to ensure data integrity

Requirements :

- Must have 8+ years of experience as a Snowflake Data Engineer or in a related role

- Must have experience with Snowflake

- Strong Snowflake experience building, maintaining and documenting data pipelines

- Expertise in Snowflake concepts such as RBAC management, virtual warehouses, file formats, streams, zero-copy clone, and time travel, and an understanding of how to apply these features

- Strong SQL development experience including SQL queries and stored procedures

- Strong knowledge of ELT/ETL no-code/low-code tools such as Informatica or SnapLogic

- Well versed in data standardization, cleansing, enrichment, and modeling

- Proficiency in one or more programming languages such as Python, Java, or C#

- Experience with cloud computing platforms such as AWS, Azure, or GCP

- Knowledge of ELT/ETL processes, data warehousing, and data modeling

- Familiarity with data security and governance best practices

- Excellent hands-on problem-solving and analytical skills, with a track record of improving process performance
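For illustration only (not part of the posting), a minimal Python sketch of the kind of data standardization and cleansing step the requirements describe, as it might run before records are loaded into a warehouse; the field names (`country`, `email`) are assumptions:

```python
def cleanse_record(record: dict) -> dict:
    """Trim whitespace, normalize key casing, and enrich a raw record.

    A hypothetical pre-load cleansing step; real pipelines would add
    type coercion, validation, and source-specific rules.
    """
    cleaned = {}
    for key, value in record.items():
        if isinstance(value, str):
            value = value.strip()          # drop stray whitespace
        cleaned[key.lower()] = value        # standardize key casing

    # Standardize an assumed 'country' field to upper-case codes
    if cleaned.get("country"):
        cleaned["country"] = cleaned["country"].upper()

    # Enrich with a derived flag based on an assumed 'email' field
    cleaned["has_email"] = bool(cleaned.get("email"))
    return cleaned
```

For example, `cleanse_record({"Name": "  Asha ", "Country": "in"})` yields `{"name": "Asha", "country": "IN", "has_email": False}`.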
