Job Description

Description:

We are seeking a skilled and motivated Data Engineer with 3 to 5 years of experience to join our growing team.

The ideal candidate will have hands-on expertise in building robust, scalable data pipelines, working with modern data platforms, and enabling data-driven decision-making across the organization.

You'll work closely with data scientists, analysts, and engineering teams to build and maintain efficient data infrastructure and tooling.

Responsibilities:

- Design, develop, and maintain scalable ETL/ELT pipelines to support analytics and product use cases.

- Collaborate with data analysts, scientists, and business stakeholders to gather requirements and translate them into data solutions.

- Manage data integrations across a variety of internal and external sources.

- Optimize data workflows for performance, cost-efficiency, and reliability.

- Build and maintain data models and data warehouses using industry best practices.

- Monitor, troubleshoot, and improve existing data pipelines.

- Implement data quality frameworks and ensure data governance standards are followed.

- Contribute to documentation, code reviews, and knowledge sharing within the team.

Requirements:

- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.

- 3-5 years of experience as a Data Engineer or in a similar data-focused role.

- Strong command of SQL and proficiency in Python.

- Solid grounding in good software engineering practices.

- Experience with data pipeline orchestration tools such as Apache Airflow or equivalent.

- Hands-on experience with cloud data platforms (AWS/GCP/Azure) and services such as S3, Redshift, BigQuery, or Azure Data Lake.

- Experience with data warehousing concepts and tools such as Snowflake, Redshift, and Databricks.

- Familiarity with version control tools such as Git.

- Strong analytical and communication skills.

- Exposure to big data tools and frameworks such as Spark, Hadoop, or Kafka.

- Experience with containerization (Docker/Kubernetes).

- Familiarity with CI/CD pipelines and automation in data engineering.

- Awareness of data security, privacy, and compliance principles.
