
Job Description

Responsibilities :

- Design, develop, and maintain robust and scalable data pipelines and ETL/ELT processes within the Snowflake data warehouse.

- Utilize DBT (Data Build Tool) extensively for data transformations, data modeling, testing, and documentation within the Snowflake environment.

- Write and optimize complex SQL queries within Snowflake to extract, transform, and load data.

- Implement and maintain data quality checks and validation processes within Snowflake and DBT (a sample check appears after this list).

- Collaborate with data scientists, analysts, and other stakeholders to understand their data requirements and provide efficient data solutions.

- Monitor and troubleshoot data pipeline performance and resolve any data-related issues within the Snowflake environment.

- Implement and enforce data governance policies and best practices within the Snowflake platform.

- Optimize Snowflake workloads and warehouse usage for both performance and cost-effectiveness.

- Create and maintain comprehensive technical documentation for data pipelines, transformations, and data models within Snowflake and DBT.

- Adhere to best practices for Snowflake and DBT development, including version control and code management.

- Participate in code reviews and ensure code quality and adherence to standards.
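
As a rough illustration of the data quality checks mentioned above, the Snowflake SQL below flags rows that violate basic validation rules. The staging.orders table and its columns are hypothetical, not from the posting.

```sql
-- Illustrative validation query: surface rows in a staging table that
-- break basic quality rules (all names here are assumptions).
SELECT
    order_id,
    order_date,
    amount
FROM staging.orders
WHERE order_id IS NULL           -- key must be present
   OR amount < 0                 -- amounts must be non-negative
   OR order_date > CURRENT_DATE  -- no future-dated orders
LIMIT 100;
```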


Required Skillset :

Snowflake : Extensive hands-on experience (7+ years) in designing, developing, and managing data solutions on the Snowflake Data Cloud platform. This includes :

- Data loading and unloading techniques (Snowpipe, COPY INTO); a brief sketch follows this list.

- Writing and optimizing complex SQL queries within Snowflake.

- Performance tuning and optimization of Snowflake queries and workloads.

- Understanding of Snowflake architecture, security features, and data sharing capabilities.

- Experience with Snowflake scripting and stored procedures.
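
As a brief sketch of the loading techniques listed above, the Snowflake SQL below shows a one-off COPY INTO load and the same statement wrapped in a pipe so Snowpipe ingests new files automatically. The stage, table, and file format names are assumptions for illustration.

```sql
-- Bulk-load staged CSV files into a table (names are illustrative).
COPY INTO raw.orders
FROM @raw.orders_stage
FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
ON_ERROR = 'CONTINUE';

-- Wrap the same COPY statement in a pipe so Snowpipe loads new files
-- as they arrive on the stage.
CREATE PIPE raw.orders_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw.orders
  FROM @raw.orders_stage
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);
```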

DBT (Data Build Tool) : Strong proficiency (3+ years) in using DBT for :

- Building and managing data transformations within Snowflake.

- Implementing data models in SQL with Jinja templating (see the sketch after this list).

- Writing and executing data quality tests.

- Generating documentation for data models and transformations.

- Managing DBT projects and deployments.
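
As a sketch of the DBT usage described above, the model below is written in SQL with Jinja and materialized incrementally, so each run processes only rows newer than what is already loaded. Source, model, and column names are illustrative; in a real project, companion schema tests (e.g. not_null, unique) would sit alongside this file.

```sql
-- Illustrative dbt model, e.g. models/orders_daily.sql (names assumed).
{{ config(materialized='incremental', unique_key='order_id') }}

SELECT
    order_id,
    customer_id,
    order_date,
    amount
FROM {{ source('raw', 'orders') }}

{% if is_incremental() %}
  -- On incremental runs, only pick up rows newer than the latest loaded date.
  WHERE order_date > (SELECT MAX(order_date) FROM {{ this }})
{% endif %}
```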

- SQL : Excellent proficiency in writing complex and efficient SQL queries for data manipulation, transformation, and analysis.

- Data Warehousing Concepts : Deep understanding of data warehousing principles, dimensional modeling (Star Schema, Snowflake Schema), and ETL/ELT methodologies; a minimal star-schema sketch follows this list.

- Data Quality and Governance : Experience in implementing data quality checks, validation rules, and data governance policies.

- Problem-Solving : Strong analytical and problem-solving skills with the ability to troubleshoot data-related issues effectively.

- Communication : Good verbal and written communication skills to collaborate effectively with team members and stakeholders.
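
For reference, a minimal star-schema sketch in Snowflake SQL, with one fact table keyed to a single dimension. All names are illustrative; note that Snowflake records primary and foreign key constraints but does not enforce them.

```sql
-- One dimension table and one fact table referencing it (names assumed).
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name VARCHAR,
    region        VARCHAR
);

CREATE TABLE fact_orders (
    order_key    INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer (customer_key),
    order_date   DATE,
    amount       NUMBER(12, 2)
);
```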


Good to Have Skills :

- Familiarity with Python for scripting and automation tasks related to data pipelines and Snowflake.

- Experience with cloud platforms such as AWS, Azure, or GCP and their data integration services.

- Experience with other data integration and ETL tools.

- Knowledge of data visualization tools (e.g., Tableau, Power BI).

- Experience with version control systems (e.g., Git).

- Snowflake certifications.

- Experience working in an Agile development environment.


Qualifications :

- Bachelor's degree in Computer Science, Engineering, or a related field.

- Minimum of 6 years of experience as a Data Engineer with a strong focus on Snowflake and DBT.

- Proven experience in designing and implementing data solutions on the Snowflake platform.

- Significant hands-on experience with DBT for data transformations and modeling.

- Ability to work independently and as part of a collaborative team.

- Must be available to work on a 6-month contract.

