Posted on: 09/01/2026
Description:
Key Responsibilities:
- Design, develop, and optimise data pipelines and ETL processes using Snowflake.
- Implement data modelling, schema design, and performance tuning for large-scale datasets.
- Collaborate with Data Scientists and Analysts to ensure seamless data availability and integrity.
- Develop and maintain secure data sharing and governance within Snowflake.
- Integrate Snowflake with BI tools and other data platforms for reporting and analytics.
- Monitor and troubleshoot data workflows, ensuring high availability and reliability.
- Document processes, standards, and best practices for data engineering and Snowflake usage.
Required Skills & Qualifications:
- Strong expertise in Snowflake Data Warehouse architecture and features.
- Proficiency in SQL for complex queries and optimisation.
- Hands-on experience with ETL tools (e.g., Informatica, Talend, or similar).
- Knowledge of Python or other scripting languages for data transformation.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and their integration with Snowflake.
- Understanding of data modelling, data governance, and security best practices.
- Excellent problem-solving and analytical skills.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1599335