
Snowflake DB Developer

EXAFLUENCE GLOBAL ANALYTICS PRIVATE LIMITED
Multiple Locations
4 - 10 Years
Rating: 3.8 (7+ Reviews)

Posted on: 14/12/2025

Job Description



Role Overview:


We are looking for a skilled Snowflake Developer with strong capabilities in enterprise data modeling, Talend ETL development, and data pipeline performance optimization. The ideal candidate will design scalable data models, build robust ingestion/transformation pipelines, and ensure high-performance data delivery across Snowflake and Azure environments.


Required Skills:


- 4 to 8 years of experience in data engineering or BI development.


- Strong SQL and advanced Snowflake knowledge (warehouses, micro-partitions, performance tuning).


- Hands-on expertise with Talend ETL development and job orchestration.


- Solid understanding of dimensional modeling, DWH principles, and modeling best practices.


- Experience optimizing ETL/ELT pipelines and Snowflake compute usage.


- Programming experience in Python or Java.


- Azure experience (ADF, ADLS, Functions) preferred.


Key Responsibilities:


1. Snowflake Data Modeling:


- Design and maintain enterprise data models using star schema, snowflake schema, and dimensional modeling principles.


- Translate business requirements into logical and physical Snowflake models (fact, dimension, staging, ODS).


- Implement efficient structures leveraging micro-partitions, clustering, materialized views, and schema design best practices.


- Define data standards, metadata, and documentation for long-term scalability.
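
To make the modeling points above concrete, here is a minimal Snowflake SQL sketch of a star-schema fact/dimension pair with a clustering key and a materialized view. All object and column names (dim_customer, fact_orders, mv_daily_sales) are hypothetical illustrations, not a prescribed design.

-- Hypothetical dimension with a surrogate key and an SCD2-style validity window.
CREATE TABLE dim_customer (
    customer_sk     NUMBER IDENTITY,      -- surrogate key
    customer_id     VARCHAR(50),          -- natural/business key
    customer_name   VARCHAR(200),
    region          VARCHAR(100),
    effective_from  DATE,
    effective_to    DATE
);

-- Hypothetical fact table; clustering aligns micro-partitions with the common date filter.
CREATE TABLE fact_orders (
    customer_sk     NUMBER,               -- references dim_customer.customer_sk
    order_date      DATE,
    order_amount    NUMBER(18,2)
)
CLUSTER BY (order_date);

-- Materialized view pre-aggregating a frequently queried rollup.
CREATE MATERIALIZED VIEW mv_daily_sales AS
SELECT order_date, SUM(order_amount) AS total_sales
FROM fact_orders
GROUP BY order_date;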


2. Talend ETL Development:


- Build and maintain ETL/ELT pipelines using Talend Open Studio or Talend Cloud.


- Develop extraction, transformation, validation, and loading workflows for structured and semi-structured data.


- Implement best practices for ETL performance, parallelization, error handling, and recoverability.


- Integrate data from various sources (databases, APIs, flat files, cloud storage).
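
The sketch below illustrates only the Snowflake side of such an ingestion flow for semi-structured JSON (the Talend job itself is built in Talend Studio or Talend Cloud). The stage, file format, and table names are hypothetical, and credentials/storage-integration details are omitted.

-- Hypothetical file format and external stage; add a storage integration or credentials in practice.
CREATE FILE FORMAT ff_json TYPE = 'JSON' STRIP_OUTER_ARRAY = TRUE;

CREATE STAGE events_stage
  URL = 'azure://<storage-account>.blob.core.windows.net/events/'
  FILE_FORMAT = (FORMAT_NAME = 'ff_json');

-- Land raw JSON documents into a VARIANT column.
CREATE TABLE raw_events (payload VARIANT);

COPY INTO raw_events
FROM @events_stage
ON_ERROR = 'CONTINUE';    -- tolerate bad records instead of failing the whole load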


3. Data Pipeline Optimization:


- Optimize Snowflake workloads, query performance, compute cost, and storage usage.


- Tune Talend ETL jobs for speed, reliability, and efficient resource consumption.


- Improve end-to-end data pipelines using techniques such as:


- Pushdown optimization


- Warehouse sizing & auto-scaling


- Partitioning & clustering strategies


- Caching and transformation tuning


- Monitor pipelines using logs, dashboards, and automated alerts.
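
A hedged sketch of a few of the tuning and monitoring levers listed above, in Snowflake SQL; the warehouse name, sizes, and thresholds are illustrative values, not recommendations.

-- Right-size and auto-scale a hypothetical ETL warehouse to balance speed and credit spend.
ALTER WAREHOUSE etl_wh SET
  WAREHOUSE_SIZE    = 'MEDIUM'
  AUTO_SUSPEND      = 60       -- suspend after 60 s idle
  AUTO_RESUME       = TRUE
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 3;       -- multi-cluster scaling for concurrency spikes

-- Surface long-running or queued queries as input for dashboards and alerts.
SELECT query_id,
       warehouse_name,
       total_elapsed_time / 1000   AS elapsed_s,
       queued_overload_time / 1000 AS queued_s
FROM snowflake.account_usage.query_history
WHERE start_time >= DATEADD('day', -1, CURRENT_TIMESTAMP())
  AND total_elapsed_time > 300000     -- longer than 5 minutes
ORDER BY total_elapsed_time DESC;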


Supporting Responsibilities:


- Automate workflows using Snowflake Tasks, Streams, and Snowpipe (see the sketch after this list).


- Use Python or Java for reusable utilities, custom connectors, or data quality scripts.


- Work with Azure Data Factory, ADLS, and Azure Functions for orchestration and cloud integration.


- Implement CI/CD pipelines using Git/Azure DevOps.


- Collaborate with analytics, application, and DevOps teams to deliver high-quality data products.
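
For the Tasks/Streams/Snowpipe automation item above, a minimal sketch in Snowflake SQL; the pipe, stream, task, and table names are hypothetical, and the stage and raw table reuse the earlier ingestion sketch.

-- Continuous micro-batch loading; add AUTO_INGEST plus a notification integration for event-driven loads.
CREATE PIPE events_pipe AS
  COPY INTO raw_events FROM @events_stage;

-- Stream captures newly loaded rows (change data capture on the raw table).
CREATE STREAM raw_events_stream ON TABLE raw_events;

CREATE TABLE stg_events (event_id VARCHAR, event_ts TIMESTAMP_NTZ);

-- Scheduled task that only runs when the stream has unprocessed rows.
CREATE TASK transform_events
  WAREHOUSE = etl_wh
  SCHEDULE  = '15 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('RAW_EVENTS_STREAM')
AS
  INSERT INTO stg_events
  SELECT payload:event_id::VARCHAR, payload:ts::TIMESTAMP_NTZ
  FROM raw_events_stream;

ALTER TASK transform_events RESUME;   -- tasks are created suspended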

