hirist

Job Description

Responsibilities :


- Design and implement scalable data pipelines using SQL and Python.

- Work with at least one data warehouse platform.

- Apply data warehousing methodologies and integration patterns.

- Build ETL/ELT pipelines using SSIS, Informatica, dbt, or Azure Data Factory.

- Develop SQL- and Python-based data processing workflows.

- Optimize pipelines for performance, reliability, and scalability.

- Collaborate with cross-functional teams on cloud-based data solutions.

Requirements :


- 4-7 years of experience designing and building enterprise-grade data engineering solutions, with strong foundations in data warehousing, ETL/ELT, and cloud platforms.

Mandatory Skills :


- Strong SQL skills.

- Hands-on experience with Python-based data pipelines.

- Experience with at least one ETL/ELT tool (SSIS / Informatica / dbt / Azure Data Factory).

- Solid understanding of data warehousing concepts.

- Exposure to at least one cloud platform (GCP preferred).

- Strong knowledge of query and code optimization techniques.

