Posted on: 27/03/2026
Description:
We are seeking a Data Engineer with strong ETL fundamentals and cloud proficiency (GCP/BigQuery preferred) to build and maintain scalable data pipelines and transformation workflows.
Key Responsibilities:
- Pipeline Development: Design and maintain scalable ETL/ELT workflows and data models using Dataform or dbt.
- Cloud Architecture: Develop and optimize solutions on GCP (BigQuery/GCS) or AWS/Azure.
- Collaboration: Work with analysts and stakeholders to deliver reliable data products and support data mesh architecture.
- Operations: Apply data governance, troubleshoot pipeline issues, and drive proactive monitoring.
Requirements:
- Experience: 6+ years of hands-on Data Engineering experience with end-to-end pipeline ownership.
Technical Skills:
- Strong SQL (complex queries/optimization) and Python.
- Hands-on experience with Dataform, dbt, or BigQuery.
- Proficiency in Cloud Platforms (GCP preferred, AWS, or Azure) and object storage (GCS, S3, ADLS).
- Data Modeling: Solid experience in transformation logic, workflow orchestration, and scalable data processing.
- Preferred: Exposure to Spark/PySpark, data migration projects, and data mesh concepts.
- Education: Bachelor's or Master's degree in Computer Science or a related field.
Posted in: Data Engineering
Functional Area: Big Data / Data Warehousing / ETL
Job Code: 1624077