Posted on: 21/08/2025
Job Description:
Responsibilities:
- Build and optimize data models in Snowflake for analytics and reporting.
- Implement and manage transformation workflows using dbt.
- Work with AWS Glue, SnapLogic, and Fivetran to integrate data from multiple sources into the data warehouse.
- Ensure data quality, consistency, and governance across pipelines.
- Collaborate with analytics, product, and business teams to understand requirements and deliver data solutions.
- Monitor pipeline performance and optimize for cost and efficiency.
- Mentor junior data engineers and contribute to best practices, standards, and automation initiatives.
Requirements:
- Minimum of 4 years of experience in Data Engineering.
- Strong expertise in Snowflake (data modeling, performance tuning, security).
- Hands-on experience with dbt for data transformations.
- Proven experience with AWS Glue, SnapLogic, Fivetran, and other ETL tools.
- Strong SQL and Python skills for ETL development and automation.
- Experience in building and optimizing large-scale data pipelines.
- Knowledge of data governance, security, and compliance best practices.
- Excellent problem-solving, communication, and leadership skills.
Posted By
Naveen
HR - Associate at DECISION POINT PRIVATE LIMITED
Last Active: N/A (the recruiter posted this job through a third-party tool).
Posted in
Data Engineering
Functional Area
Data Engineering
Job Code
1533349