Job Description

Role Overview :

We are seeking a talented and experienced Data Engineer to join our team. The ideal candidate will have expertise in technologies such as Metabase, dbt, Stitch, Snowflake, Avo, and MongoDB. As a Data Engineer, you will play a crucial role in designing, developing, and maintaining our data infrastructure to support our analytics and data-driven decision-making processes.

Responsibilities :

- Designing, developing, and implementing scalable data pipelines and ETL processes, using tools such as Stitch and dbt to ingest, transform, and load data from various sources into our data warehouse (Snowflake).

- Implementing data modeling best practices and standards, using dbt to create and manage data models for reporting and analytics.

- Collaborating with cross-functional teams to understand data requirements and deliver solutions that meet business needs.

- Developing and maintaining dashboards and visualizations in Metabase to enable self-service analytics and data exploration for internal teams.

- Building and optimizing ETL processes to ensure data quality and integrity.

- Optimizing data processing and storage solutions for performance, scalability, and reliability, leveraging cloud-based technologies.

- Implementing monitoring and alerting systems to proactively identify and address data issues.

- Implementing data quality checks and monitoring processes to ensure the accuracy, completeness, and integrity of data (a sketch of this kind of check appears after this list).

- Managing and optimizing databases such as MongoDB for performance and scalability (see the indexing sketch after this list).

- Developing and maintaining documentation, best practices, and standards for data engineering processes and workflows.

- Staying up to date with emerging technologies and trends in data engineering, machine learning, and analytics, and evaluating their potential impact on data strategy and architecture.
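
For illustration, below is a minimal sketch of the kind of post-load data quality check referenced above. It assumes the snowflake-connector-python package; the table name, checks, warehouse, and credentials are hypothetical placeholders rather than our actual schema.

```python
"""Minimal sketch of a post-load data quality check against Snowflake.

Assumes the snowflake-connector-python package; the table, checks,
warehouse, and credentials are hypothetical placeholders.
"""
import os

import snowflake.connector

# Each check is a SQL expression that evaluates to TRUE when the check FAILS.
CHECKS = {
    "orders_not_empty": "SELECT COUNT(*) = 0 FROM ANALYTICS.PUBLIC.ORDERS",
    "no_null_order_ids": (
        "SELECT COUNT(*) > 0 FROM ANALYTICS.PUBLIC.ORDERS WHERE ORDER_ID IS NULL"
    ),
}


def run_checks() -> list[str]:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORMING",
    )
    failures = []
    try:
        cur = conn.cursor()
        for name, sql in CHECKS.items():
            cur.execute(sql)
            (failed,) = cur.fetchone()
            if failed:
                failures.append(name)
    finally:
        conn.close()
    return failures


if __name__ == "__main__":
    failed = run_checks()
    if failed:
        # Wire this into alerting (e.g., a Slack webhook or PagerDuty event).
        raise SystemExit(f"Data quality checks failed: {failed}")
    print("All checks passed.")
```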
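
Along the same lines, a minimal sketch of the kind of index management referenced in the MongoDB point above, using the pymongo driver; the connection URI, collection, and field names are again hypothetical.

```python
"""Minimal sketch of MongoDB index management for a common query pattern.

Assumes the pymongo driver; the connection URI, database, collection,
and field names are hypothetical placeholders.
"""
import os

from pymongo import ASCENDING, DESCENDING, MongoClient

client = MongoClient(os.environ["MONGO_URI"])
orders = client["app"]["orders"]

# Compound index matching a frequent query:
# fetch one customer's orders, newest first.
orders.create_index(
    [("customer_id", ASCENDING), ("created_at", DESCENDING)],
    name="customer_recent_orders",
)

# Confirm the query planner actually uses the new index.
plan = orders.find({"customer_id": "c123"}).sort("created_at", DESCENDING).explain()
print(plan["queryPlanner"]["winningPlan"])
```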

Requirements :

- Bachelor's or Master's degree in Computer Science.

- Minimum of 4 years of experience working as a data engineer, with expertise in Metabase, dbt, Stitch, Snowflake, Avo, and MongoDB.

- Strong programming skills in languages such as Python, and experience with SQL and database technologies (e.g., PostgreSQL, MySQL, MongoDB).

- Hands-on experience with data integration tools (e.g., Stitch), data modeling tools (e.g., dbt), and BI platforms (e.g., Metabase).

- Experience with cloud platforms such as AWS.

- Strong understanding of data modeling concepts, database design, and data warehousing principles.

- Experience with big data technologies and frameworks (e.g., Hadoop, Spark, Kafka) and cloud-based data platforms (e.g., AWS EMR, Azure Databricks, Google BigQuery).

- Familiarity with data integration tools, ETL processes, and workflow orchestration tools (e.g., Apache Airflow, Apache NiFi); a minimal orchestration sketch follows this list.
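
As a rough illustration of the orchestration point above, here is a minimal Airflow DAG sketch that waits for loaded data and then runs dbt transformations and tests. It assumes Apache Airflow 2.x with dbt available on the worker; the DAG id, schedule, and dbt project path are hypothetical placeholders.

```python
"""Minimal sketch of orchestrating a load-then-transform flow with Airflow.

Assumes Apache Airflow 2.x with dbt installed on the worker; the DAG id,
schedule, and dbt project path are hypothetical placeholders.
"""
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {"retries": 2, "retry_delay": timedelta(minutes=5)}

with DAG(
    dag_id="nightly_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    # Stitch extracts and loads on its own schedule; this placeholder task
    # stands in for a sensor or freshness check on the loaded data.
    wait_for_load = BashOperator(
        task_id="wait_for_stitch_load",
        bash_command="echo 'replace with a check that new data has landed'",
    )

    # Build dbt models in Snowflake, then run dbt's built-in tests.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt_project && dbt run",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt_project && dbt test",
    )

    wait_for_load >> dbt_run >> dbt_test
```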

