Snowflake Data Engineer - PySpark/ETL

Jobtravia Pvt. Ltd.
Multiple Locations
5 - 7 Years

Posted on: 26/08/2025

Job Description

Job Title: Snowflake Data Engineer

Location: Pune, Hyderabad | Hybrid

Experience: 5+ Years

About the Role:

We are looking for a Snowflake Data Engineer with strong expertise in data architecture, pipeline development, and cloud-native solutions. You'll play a key role in designing, optimizing, and maintaining scalable data platforms that transform raw data into actionable insights. If you enjoy solving complex data challenges and want to work with cutting-edge cloud technologies, this role is for you.

Key Responsibilities:

Data Engineering & Pipeline Development:

- Design, build, and maintain scalable, efficient, and secure data pipelines.

- Implement ETL/ELT processes leveraging Snowflake and Python/PySpark (a minimal sketch follows this list).
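
For illustration only, here is a minimal PySpark sketch of such an ETL step: read raw files, apply a transformation, and load the result into Snowflake through the Spark-Snowflake connector. Every path, credential, and table name below is a hypothetical placeholder, not part of the actual role or stack.

    # Minimal ETL sketch: raw Parquet -> transform -> Snowflake.
    # All paths, credentials, and identifiers are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()

    # Extract: read raw data from object storage (hypothetical path).
    raw = spark.read.parquet("s3a://example-bucket/raw/orders/")

    # Transform: deduplicate, filter out bad rows, derive a date column.
    orders = (
        raw.dropDuplicates(["order_id"])
           .filter(F.col("order_status").isNotNull())
           .withColumn("order_date", F.to_date("order_ts"))
    )

    # Load: write to Snowflake via the Spark-Snowflake connector,
    # which must be available on the Spark classpath.
    sf_options = {
        "sfURL": "example_account.snowflakecomputing.com",
        "sfUser": "ETL_USER",
        "sfPassword": "********",
        "sfDatabase": "ANALYTICS",
        "sfSchema": "STAGING",
        "sfWarehouse": "ETL_WH",
    }

    (orders.write
           .format("snowflake")
           .options(**sf_options)
           .option("dbtable", "ORDERS_STG")
           .mode("overwrite")
           .save())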

Data Modeling & Architecture:

- Develop and optimize data models, schemas, and warehouses to support analytics and BI.

- Ensure data integrity, performance tuning, and compliance with governance standards.

Snowflake Expertise:

- Leverage advanced Snowflake features: Snowpipe, Streams, Tasks, Time Travel, Data Sharing.

- Optimize query performance and manage warehouses, roles, and security policies (see the sketch after this list).
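
To give those features concrete shape, the sketch below uses the Snowflake Python connector to create a Stream that captures changes on a staging table and a scheduled Task that loads them downstream. All identifiers, credentials, and the 5-minute schedule are hypothetical assumptions.

    # Sketch: change capture with a Stream plus a scheduled Task.
    # All identifiers and credentials are hypothetical placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="example_account",
        user="ETL_USER",
        password="********",
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="STAGING",
    )
    cur = conn.cursor()

    # Stream: records inserts/updates/deletes on the staging table.
    cur.execute("CREATE OR REPLACE STREAM ORDERS_STREAM ON TABLE ORDERS_STG")

    # Task: runs every 5 minutes, but only when the stream has data.
    cur.execute("""
        CREATE OR REPLACE TASK MERGE_ORDERS_TASK
          WAREHOUSE = ETL_WH
          SCHEDULE = '5 MINUTE'
        WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
        AS
          INSERT INTO ORDERS_FINAL (order_id, order_status, order_date)
          SELECT order_id, order_status, order_date
          FROM ORDERS_STREAM
          WHERE METADATA$ACTION = 'INSERT'
    """)

    # Tasks are created suspended; resume to start the schedule.
    cur.execute("ALTER TASK MERGE_ORDERS_TASK RESUME")

    cur.close()
    conn.close()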

Cloud & Integration:

- Work across AWS, Azure, or GCP environments for data storage, processing, and orchestration.

- Integrate Snowflake with third-party tools such as Airflow, dbt, Kafka, and BI platforms (a minimal sketch follows this list).
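
As one example of such an integration, here is a minimal Airflow DAG that runs a Snowflake statement on a daily schedule. It assumes Airflow 2.4+ with the apache-airflow-providers-snowflake package installed; the DAG id, connection id, and SQL are hypothetical.

    # Sketch: daily Airflow DAG triggering a Snowflake load step.
    # DAG id, connection id, and SQL are placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

    with DAG(
        dag_id="orders_daily_load",
        start_date=datetime(2025, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        refresh_orders = SnowflakeOperator(
            task_id="refresh_orders",
            snowflake_conn_id="snowflake_default",  # configured in Airflow
            sql=(
                "INSERT INTO ANALYTICS.MARTS.ORDERS_DAILY "
                "SELECT * FROM ANALYTICS.STAGING.ORDERS_FINAL"
            ),
        )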

Collaboration & Delivery:

- Partner with data scientists, BI teams, and business stakeholders to deliver data-driven insights.

- Contribute to client presentations, solution proposals, and business development activities.

What You Bring:

- 5+ years of experience in Snowflake-based data engineering with strong Python/PySpark expertise.

- Solid experience with ETL/ELT pipeline design, data modeling, and cloud-native data solutions.

- Hands-on knowledge of AWS, Azure, or GCP cloud services.

- Strong understanding of SQL, performance tuning, and data security best practices.

- Excellent communication, leadership, and stakeholder management skills.

- Analytical mindset with adaptability to fast-paced and evolving environments.

Preferred Qualifications:

- Experience with orchestration and transformation tools (Airflow, Prefect, dbt).

- Knowledge of CI/CD practices and DevOps for data.

- Exposure to real-time streaming data (Kafka, Kinesis, Pub/Sub).

- Certifications in Snowflake, AWS, Azure, or GCP.
