Posted on: 25/02/2026
Job Location: Bangalore/Pune
Key Responsibilities:
- Architect, develop, and optimize the Snowflake data warehouse and data pipelines.
- Implement data ingestion frameworks and ELT/ETL processes from multiple sources (see the ingestion sketch after this list):
1. API-based customer platforms
2. Workday
3. Kronos
4. SAP
5. Dynamics 365 (ERP)
6. Database-to-database replication
- Apply Snowflake best practices for performance, cost optimization, and scalability.
- Ensure data quality, governance, lineage, and security across the data landscape.
- Collaborate with architects and business stakeholders to deliver reliable, analytics-ready data.
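To give a flavour of the ingestion work, here is a minimal Snowpipe sketch for auto-loading files landed in S3. All object names (ingest.s3_orders_stage, raw.orders) and the bucket URL are hypothetical, and the stage's storage integration/credentials are omitted:

    -- Hypothetical external stage over the S3 landing bucket
    -- (storage integration or credentials omitted for brevity).
    CREATE OR REPLACE STAGE ingest.s3_orders_stage
      URL = 's3://example-landing-bucket/orders/'
      FILE_FORMAT = (TYPE = 'JSON');

    -- Snowpipe that loads new files as they arrive in the stage.
    -- AUTO_INGEST = TRUE assumes S3 event notifications are wired to the pipe.
    CREATE OR REPLACE PIPE ingest.orders_pipe
      AUTO_INGEST = TRUE
    AS
      COPY INTO raw.orders (payload, loaded_at)
      FROM (SELECT $1, CURRENT_TIMESTAMP() FROM @ingest.s3_orders_stage);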
Tools & Technologies:
- Snowflake Core: Snowflake SQL, Snowpipe, Streams & Tasks, Time Travel, Clustering, and Resource Monitors (see the incremental-load sketch after this list)
- Data Transformation & Orchestration: dbt, Airflow, Informatica, Fivetran, or similar ETL tools
- Cloud Integration: AWS (S3) for hosting, Azure (ADLS) for data sources
- Languages: SQL, Python
- Version Control & CI/CD: Git, Jenkins (or equivalent)
- Monitoring: CloudWatch, Snowflake Query Profiling & Cost Optimization tools
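A minimal sketch of the Streams & Tasks pattern named above, assuming a hypothetical raw.orders source table (with a VARIANT payload column), an analytics.orders_clean target, and an ingest_wh warehouse; every name here is illustrative:

    -- Stream that captures row-level changes on the raw table.
    CREATE OR REPLACE STREAM raw.orders_stream ON TABLE raw.orders;

    -- Task that wakes every 5 minutes but only runs when the stream has data.
    CREATE OR REPLACE TASK analytics.load_orders_task
      WAREHOUSE = ingest_wh
      SCHEDULE = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('RAW.ORDERS_STREAM')
    AS
      INSERT INTO analytics.orders_clean (order_id, amount, loaded_at)
      SELECT payload:order_id::NUMBER, payload:amount::NUMBER, CURRENT_TIMESTAMP()
      FROM raw.orders_stream
      WHERE METADATA$ACTION = 'INSERT';

    -- Tasks are created suspended; resume to start the schedule.
    ALTER TASK analytics.load_orders_task RESUME;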
Qualifications:
- 5+ years of experience in Snowflake implementation (greenfield preferred).
- Strong background in data engineering, data ingestion, and pipeline design.
- Experience integrating data from APIs, ERP, HR, and operational systems.
- Strong understanding of the AWS and Azure ecosystems.
- Expertise in Snowflake optimization, data modeling, and cost control (a cost-control sketch follows this list).
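One common cost-control lever is a Resource Monitor. A minimal sketch, assuming a hypothetical ingest_wh warehouse and an illustrative 100-credit monthly budget:

    -- Monthly credit budget: warn at 80%, suspend the warehouse at 100%.
    CREATE OR REPLACE RESOURCE MONITOR monthly_ingest_rm
      WITH CREDIT_QUOTA = 100
      FREQUENCY = MONTHLY
      START_TIMESTAMP = IMMEDIATELY
      TRIGGERS ON 80 PERCENT DO NOTIFY
               ON 100 PERCENT DO SUSPEND;

    -- Attach the monitor to the warehouse it should govern.
    ALTER WAREHOUSE ingest_wh SET RESOURCE_MONITOR = monthly_ingest_rm;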
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1615741