Posted on: 10/04/2026
Job Summary:
- Looking for a skilled ETL developer with hands-on experience in Talend and Snowflake.
- The ideal candidate will design, develop, and optimize ETL workflows and integrate data pipelines to support business intelligence and data warehousing solutions.
Key Responsibilities:
- Design, develop, and maintain ETL/ELT workflows using Talend Data Integration tools.
- Implement and optimize data pipelines to ingest, transform, and load data into Snowflake.
- Collaborate with data analysts, architects, and business stakeholders to gather requirements and translate them into technical specifications.
- Monitor data pipelines for performance and reliability, and troubleshoot issues as needed.
- Develop and maintain technical documentation related to ETL processes and data models.
- Perform data quality checks and ensure data governance policies are followed.
- Optimize Snowflake warehouse performance and storage costs through partitioning, clustering, and query tuning.
- Support the migration of legacy ETL processes to Talend/Snowflake.
Required Skills & Qualifications:
- 4+ years of experience with Talend Data Integration (preferably Talend Open Studio or Talend Enterprise).
- 2+ years of hands-on experience with Snowflake data warehouse platform.
- Proficient in SQL, data modeling, and performance tuning.
- Experience with cloud platforms (AWS, Azure, or GCP) and cloud data ecosystems.
- Strong understanding of data warehousing concepts, ETL best practices, and data governance.
- Experience in working with APIs, flat files, and various data sources/formats.
- Familiarity with version control systems (e.g., Git) and CI/CD pipelines for data workflows.
- Excellent communication and collaboration skills.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1627609