Posted on: 24/10/2025
Description :
We are actively seeking an experienced and passionate Senior Snowflake + Python Developer to join our growing data engineering team. You will be responsible for designing, building, and optimizing scalable, high-performance data pipelines and data warehouse solutions leveraging the modern cloud capabilities of Snowflake and the flexibility of Python/PySpark.
This role requires deep technical expertise, especially in data warehousing, complex SQL, ETL concepts, and performance tuning.
Key Details :
- Position: Snowflake + Python Developer
- Experience: 6 to 12 Years
- Interview Mode: Face-to-Face Drive (Mandatory)
- Notice Period: Immediate joiners preferred (candidates with a notice period of up to 15 days may also be considered)
- Location: Bengaluru, Karnataka, India
Mandatory Face-to-Face Interview Drive Details :
- Date: 25th October (Saturday)
- Time: 9:30 AM to 3:00 PM
- Venue: 11th Floor, CR2, Prestige Shantiniketan, Whitefield Road, Bengaluru, Karnataka, India
Core Responsibilities :
- Data Pipeline Development: Design, develop, and maintain robust, scalable ETL/ELT pipelines for ingesting and transforming high-volume data using Python/PySpark and native Snowflake capabilities.
- Snowflake Architecture: Implement best practices for data modeling, performance optimization, security, and resource management within the Snowflake Data Cloud.
- SQL & Data Logic: Write, optimize, and manage complex stored procedures (in Snowflake SQL or Python/Snowflake Scripting), views, and functions to enforce business logic and transformations.
- Legacy Integration: Leverage skills in Oracle and PL/SQL to manage data migration and integration with existing enterprise systems.
- Automation & Scripting: Utilize Unix Shell Scripting to automate job scheduling, monitoring, file processing, and environment setup tasks.
- Performance Tuning: Proactively identify and resolve performance bottlenecks across Snowflake queries, ETL jobs, and data models to ensure efficient data delivery.
- Collaboration: Work closely with Data Architects, BI Developers, and business stakeholders to translate technical and functional requirements into efficient data solutions.
Required Technical Skills :
- Experience Benchmark: 5-7+ years of dedicated experience in Data Warehouse (DWH), ETL, and Business Intelligence (BI) projects.
- Snowflake Expertise: Proven hands-on experience with the Snowflake Database, including working with SnowSQL, understanding micro-partitions, virtual warehouses, roles, and resource monitors.
- Python/PySpark: Strong, demonstrable expertise in Python and/or PySpark for data processing, ETL automation, and building data transformation frameworks.
- Database Proficiency: Strong command of traditional databases, including Oracle and development experience using PL/SQL.
- Scripting: Proficiency in Unix Shell Scripting for file handling, job orchestration, and utility tasks.
- DWH & ETL Concepts: Deep understanding of modern data warehousing principles, data modeling (Star/Snowflake schema), and complex ETL methodologies.
- Troubleshooting: Exceptional ability to troubleshoot complex data flow issues, job failures, and perform performance tuning on SQL queries and ETL jobs.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1564069