
Data Engineer - Python/Apache Airflow

Dash Hire
Multiple Locations
1 - 5 Years

Posted on: 19/11/2025

Job Description

Description:

Responsibilities:

- Assemble large, complex data sets that meet functional and non-functional business requirements.

- Identify, design, and implement internal process improvements: automating manual processes, optimising data delivery, re-designing transformations for greater scalability, etc.

- Build and use the infrastructure and services required for optimal extraction, transformation, and loading of data from a wide variety of sources using GCP services.

- Work with stakeholders, including the Product, Data and Design teams, to assist with data-related technical issues and support their data requirements.

Requirements:

- Bachelor's degree with a minimum of 1.5 years of experience working successfully in globally distributed teams.

- Must have experience working with Python and data-handling frameworks (Spark, Beam, etc.).

- Experience with cloud storage and compute for data pipelines in GCP (GCS, BigQuery, Cloud Composer, etc.).

- Experience writing Airflow DAGs to orchestrate data pipelines.

- Experience handling data from third-party providers (Google Analytics, Google Ads, etc.) is a great plus.

- Experience in manipulating, processing and extracting value from large disconnected datasets.

- Experience with software engineering practices in data engineering, e.g. release management, testing, etc., and corresponding tooling (dbt, Great Expectations).

- Basic knowledge of dbt is good to have.

- Knowledge of data privacy and security.

- Excellent verbal and written communication skills.

