Posted on: 17/11/2025
Required Technical Skill Set:
- DBT
- PL/SQL
- Overall knowledge of Azure/AWS
- Knowledge of DB modelling
- Knowledge of data warehouse concepts
- Well versed in Agile delivery
Desired Competencies (Technical/Behavioral Competency):
Good-to-Have:
- Informatica/SSIS/ADF
Responsibility of / Expectations from the Role:
- Assist in the design and implementation of Snowflake-based analytics solutions (data lake and data warehouse) on Azure.
- Apply profound experience in designing and developing data integration solutions using ETL tools such as DBT.
- Implement cloud data warehouses hands-on using Snowflake and Azure Data Factory.
- Apply solid MS SQL Server skills, including reporting experience.
- Work closely with product managers and engineers to design, implement, test, and continually improve scalable data solutions and services running on DBT and Snowflake cloud platforms.
- Implement critical and non-critical system data integration and ingestion fixes for the data platform and environment.
- Ensure root-cause resolution of identified problems.
- Monitor and support Data Solutions jobs and processes to meet daily SLAs.
- Analyze the current analytics environment and recommend appropriate data warehouse modernization and migration to the cloud.
- Develop Snowflake deployment (using Azure DevOps or a similar CI/CD tool) and usage best practices.
- Follow best practices and standards around data governance, security, and privacy.
- Work comfortably in a fast-paced team environment, coordinating multiple projects.
- Manage the software development life cycle effectively, with experience using GitHub.
- Leverage tools such as Fivetran, DBT, Snowflake, and GitHub to drive ETL, data modeling, and analytics.
- Document data transformations and data analytics processes.
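To illustrate the kind of DBT transformation work described above, here is a minimal sketch of a dbt staging model. The source name (`erp`, `orders`), column names, and file path are hypothetical, for illustration only; a real model would reference sources defined in the project's own `sources.yml`.

```sql
-- models/staging/stg_orders.sql (hypothetical example)
-- Reads from a declared source and applies light cleanup
-- before downstream marts build on it.

with source as (

    select * from {{ source('erp', 'orders') }}

),

renamed as (

    select
        order_id,
        customer_id,
        cast(order_date as date) as order_date,
        amount            as order_amount
    from source
    where order_id is not null   -- drop malformed rows early

)

select * from renamed
```

Running `dbt run --select stg_orders` would materialize this model in Snowflake according to the project's configured materialization (view by default), and the same pattern extends to the CI/CD deployment mentioned above, where a pipeline runs `dbt build` against a target environment.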
Posted in: Data Engineering
Functional Area: Big Data / Data Warehousing / ETL
Job Code: 1575609