Posted on: 28/11/2025
About the job:
We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team.
The ideal candidate will have 5+ years of experience in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures.
This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.
Key Responsibilities:
- 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies.
- Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT (Extract, Load, Transform) processes across various data sources.
- SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis (a sketch of such a procedure follows this list).
- Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT.
- Architecture & Performance: Design and optimize high-performance data architectures.
- Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
- Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
- Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions, and create and maintain clear documentation for data processes, data models, and pipelines.
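For illustration, the stored-procedure work mentioned above might look like the minimal Snowflake Scripting sketch below. Every schema, table, and column name here (reporting.daily_sales, staging.orders, and so on) is an assumption made for the example, not a detail of this role:

    CREATE OR REPLACE PROCEDURE refresh_daily_sales()
    RETURNS VARCHAR
    LANGUAGE SQL
    AS
    $$
    BEGIN
        -- Rebuild an assumed reporting table from an assumed staging table.
        CREATE OR REPLACE TABLE reporting.daily_sales AS
        SELECT
            order_date,
            SUM(amount)                 AS total_amount,
            COUNT(DISTINCT customer_id) AS unique_customers
        FROM staging.orders
        GROUP BY order_date;
        RETURN 'reporting.daily_sales refreshed';
    END;
    $$;

A procedure like this could then be invoked with CALL refresh_daily_sales(); from a scheduler or a Snowflake task.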
Skills & Qualifications:
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases, including writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Experience using Power BI for data visualization and business intelligence reporting.
- Experience with Fivetran for automated ELT pipelines.
- Familiarity with Sigma Computing, Tableau, Oracle, and DBT (a dbt snapshot sketch follows this list).
- Strong data analysis, requirement gathering, and mapping skills.
- Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF); exposure to AWS or GCP is a plus.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (other languages such as Java or Scala are a plus).
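To illustrate the SCD Type-2 responsibility mentioned earlier, the sketch below uses dbt's snapshot feature, which is one common way to implement Type-2 history in dbt. The source ('crm', 'customers'), the unique_key, and the updated_at column are assumptions for the example and would need to exist in the project's sources:

    {% snapshot customers_snapshot %}

    {{ config(
        target_schema='snapshots',
        unique_key='customer_id',
        strategy='timestamp',
        updated_at='updated_at'
    ) }}

    -- On each run, dbt closes out changed rows (dbt_valid_to) and inserts
    -- new versions (dbt_valid_from), producing SCD Type-2 history.
    select * from {{ source('crm', 'customers') }}

    {% endsnapshot %}

Running dbt snapshot then maintains the history table without hand-written MERGE logic.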
Education: Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1582315