Posted on: 07/04/2026
Description:
Experience: 7+ years
Location: Pune (Hybrid)
Looking for immediate joiners only.
- Hands-on experience with Snowflake and Python to design, build, and support end-to-end data engineering solutions.
- Proven delivery experience in R&D data product creation, including requirement understanding, data modeling, transformation logic, and analytics-ready outputs.
- Strong techno-functional skills to translate business/R&D needs into scalable ELT pipelines, ensuring data quality, performance, and usability.
- Demonstrated project delivery and coordination skills, including task planning, dependency management, stakeholder communication, and on-time execution.
Key Responsibilities:
- Design and implement scalable and efficient data pipelines using Snowflake, AWS services, and PySpark to ingest, transform, and load data from various sources.
- Develop and maintain data models and data warehousing solutions on Snowflake to support business intelligence and analytics requirements.
- Build and deploy data quality checks and monitoring systems to ensure data accuracy and reliability for stakeholders.
- Collaborate with data scientists and analysts to understand their data needs and provide them with the necessary data infrastructure and tools to perform their analyses.
- Optimize Snowflake query performance and resource utilization to ensure efficient data processing and reporting.
- Implement data security and governance policies to protect sensitive data and ensure compliance with regulatory requirements.
- Utilize dbt (Data Build Tool) to transform data in the warehouse following software engineering best practices.
Required Skillset:
- Demonstrated ability to design, develop, and maintain data warehousing solutions using Snowflake.
- Proven experience in building data pipelines using AWS services (e.g., S3, EC2, Lambda) and PySpark.
- Strong proficiency in SQL and data modeling techniques.
- Hands-on experience with dbt (Data Build Tool) for data transformation and version control.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Ability to adapt to a fast-paced and dynamic work environment.
- Experience with data governance and security best practices.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1626671