Posted on: 05/09/2025
Key Responsibilities:
- Design, develop, and maintain data pipelines for seamless data flow into Snowflake and other storage solutions.
- Implement ETL/ELT processes using DBT and orchestration tools (e.g., Apache Airflow).
- Collaborate with data analysts and stakeholders to gather requirements and deliver effective solutions.
- Optimize and monitor data workflows to ensure efficiency and high performance.
- Manage and maintain relational databases (with a focus on AWS RDS), ensuring reliability and security.
- Utilize Alteryx for data preparation and advanced analytics.
- Ensure data quality, integrity, and governance best practices.
- Document data architecture, pipelines, and workflows for team knowledge sharing.
Skills & Qualifications:
- 3+ years of experience in data engineering or related roles.
- Strong proficiency in Snowflake (data modeling, performance optimization).
- Hands-on experience with DBT (Core/Cloud) in production environments.
- Advanced SQL skills (complex queries, window functions, CTEs).
- Familiarity with Apache Airflow, Git, and collaborative workflows.
- Experience with AWS services (RDS, S3, EventBridge, Lambda).
- Familiarity with Alteryx for data preparation and analytics.
- Strong understanding of RDBMS principles and data warehousing concepts.
- Excellent problem-solving, communication, and collaboration skills.
- Experience with Tableau/Power BI and exposure to machine learning frameworks is preferred.
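As an illustration of the advanced SQL skills listed above (CTEs and window functions), a minimal self-contained sketch using Python's built-in sqlite3 module; the table and sample values are hypothetical, chosen only to make the query runnable:

```python
import sqlite3

# In-memory database with illustrative sample data (hypothetical table/values).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'acme', 100.0),
        (2, 'acme', 250.0),
        (3, 'globex', 75.0);
""")

# A CTE combined with a window function: running total of order
# amounts per customer, ordered by insertion id.
rows = conn.execute("""
    WITH ranked AS (
        SELECT customer,
               amount,
               SUM(amount) OVER (
                   PARTITION BY customer ORDER BY id
               ) AS running_total
        FROM orders
    )
    SELECT customer, amount, running_total FROM ranked
""").fetchall()

for row in rows:
    print(row)
```

The same pattern (CTEs feeding window aggregates) carries over directly to Snowflake SQL; only the connection layer differs.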
Shift Timings: 1:00 PM to 10:00 PM IST
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1541073