Posted on: 22/01/2026
Description :
Job Summary :
- Design and develop data pipelines using Snowflake
- Build and manage Snowflake objects: tables, views, streams, and tasks (see the sketch after this list)
- Perform data modeling and performance optimization
- Develop ELT/ETL workflows using Python and SQL
- Ingest data from multiple sources (APIs, databases, data lakes)
- Handle structured and semi-structured data (JSON, Parquet)
- Collaborate with analytics, business, and cloud teams
- Ensure data quality, security, and scalability
- Support client-facing delivery and project discussions
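As a rough, non-authoritative illustration of the work above, the sketch below lands JSON/Parquet payloads in a VARIANT column and wires up a stream plus a scheduled task to flatten new rows. All connection values and object names (ETL_WH, my_stage, raw_events, events_flat) are assumed placeholders, not details of this role.

# Minimal sketch, assuming placeholder credentials and object names.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
)
cur = conn.cursor()

# Landing table: one VARIANT column keeps JSON/Parquet payloads as-is
cur.execute(
    "CREATE TABLE IF NOT EXISTS raw_events "
    "(payload VARIANT, loaded_at TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP())"
)

# Load Parquet files from a (hypothetical) stage as VARIANT rows
cur.execute("""
    COPY INTO raw_events (payload)
    FROM (SELECT $1 FROM @my_stage/events/)
    FILE_FORMAT = (TYPE = PARQUET)
""")

# Stream captures new rows; the task flattens them on a schedule.
# events_flat (event_id STRING, event_ts TIMESTAMP_NTZ) is assumed to exist.
cur.execute("CREATE STREAM IF NOT EXISTS raw_events_stream ON TABLE raw_events")
cur.execute("""
    CREATE TASK IF NOT EXISTS flatten_events
      WAREHOUSE = ETL_WH
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('RAW_EVENTS_STREAM')
    AS
      INSERT INTO events_flat (event_id, event_ts)
      SELECT payload:id::STRING, payload:ts::TIMESTAMP_NTZ
      FROM raw_events_stream
""")
cur.execute("ALTER TASK flatten_events RESUME")
conn.close()

The stream/task pair keeps the incremental flatten inside Snowflake, so this step needs no external scheduler.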
Mandatory Skills :
- Strong hands-on experience in Snowflake
- Advanced SQL
- Python for data engineering
- ELT/ETL pipeline development (illustrated in the sketch after this list)
- Cloud experience (AWS / Azure / GCP)
- Data modeling concepts
- Experience with large datasets
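To make the ELT expectation concrete, here is a minimal sketch of an idempotent load step: pull rows from a source API, stage them, and MERGE into a curated table so reruns do not duplicate data. The endpoint, credentials, and table names (orders_stage, curated.orders) are hypothetical.

# Minimal sketch, assuming a JSON API that returns a list of order records.
import requests
import snowflake.connector

rows = requests.get("https://example.com/api/orders", timeout=30).json()

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
)
cur = conn.cursor()

# Stage the extract in a session-scoped temporary table
cur.execute(
    "CREATE TEMPORARY TABLE orders_stage "
    "(id STRING, amount NUMBER(12,2), updated_at TIMESTAMP_NTZ)"
)
cur.executemany(
    "INSERT INTO orders_stage (id, amount, updated_at) VALUES (%s, %s, %s)",
    [(r["id"], r["amount"], r["updated_at"]) for r in rows],
)

# Upsert keyed on id keeps the pipeline safe to rerun
cur.execute("""
    MERGE INTO curated.orders AS t
    USING orders_stage AS s
      ON t.id = s.id
    WHEN MATCHED THEN UPDATE SET t.amount = s.amount, t.updated_at = s.updated_at
    WHEN NOT MATCHED THEN INSERT (id, amount, updated_at)
      VALUES (s.id, s.amount, s.updated_at)
""")
conn.close()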
Good to Have :
- Snowpark (Python/Scala); see the sketch after this list
- DBT / Matillion / Fivetran
- CI/CD for data pipelines
- Cloud certifications
- Agile project experience
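For reference, the same kind of transformation can be expressed with Snowpark for Python as DataFrame operations that push down to Snowflake as SQL. This is only a sketch; connection values and table names (raw_events, events_flat) are placeholders.

# Minimal Snowpark sketch, assuming placeholder credentials and tables.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col
from snowflake.snowpark.types import StringType, TimestampType

session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "ETL_WH", "database": "ANALYTICS", "schema": "RAW",
}).create()

# Project typed columns out of the VARIANT payload
events = session.table("raw_events").select(
    col("payload")["id"].cast(StringType()).alias("event_id"),
    col("payload")["ts"].cast(TimestampType()).alias("event_ts"),
)

# Executes inside Snowflake; nothing is pulled back to the client
events.write.mode("append").save_as_table("events_flat")
session.close()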
Candidate Profile :
- 6-10 years of overall data engineering experience
- Strong problem-solving and communication skills
- Experience working in client-facing or consulting environments
- Ability to work independently and as part of a team
Interview Process :
Posted in : Data Engineering
Functional Area : Data Engineering
Job Code : 1604987