Posted on: 18/02/2026
Description:
- 4-7 years of experience in data engineering, with 3+ years on AWS platforms
- Strong in Python (incl. AWS SDKs), DBT, SQL, and Spark
- Proven expertise with AWS data stack (S3, Glue, EMR, Redshift, Athena, Lambda)
- Hands-on experience with workflow orchestration (Airflow/Step Functions)
- Familiarity with data lake formats (Parquet, ORC, Iceberg) and DevOps practices (Terraform, CI/CD)
- Solid understanding of data governance & security best practices
What you would do here:
- Develop and optimize ETL/ELT pipelines with Python, DBT, and AWS services (Data Ops Live).
- Build and manage S3-based data lakes using modern data formats (Parquet, ORC, Iceberg).
- Deliver end-to-end data solutions with Glue, EMR, Lambda, Redshift, and Athena.
- Implement strong metadata, governance, and security using Glue Data Catalog, Lake Formation, IAM, and KMS.
- Orchestrate workflows with Airflow, Step Functions, or AWS-native tools.
- Ensure reliability and automation with CloudWatch, CloudTrail, CodePipeline, and Terraform.
- Collaborate with analysts and data scientists to deliver business insights in an Agile setting.
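To illustrate the kind of pipeline work the responsibilities above describe, here is a minimal, dependency-free sketch of the transform step in an ETL/ELT flow. The field names (`user_id`, `amount`) are hypothetical; a real pipeline would read from S3 via Glue or Spark and load into Redshift or an Iceberg table rather than in-memory lists.

```python
# Hypothetical ETL sketch: extract -> transform -> load, using plain Python
# stand-ins for the AWS services named in the posting (S3/Glue as source,
# Redshift as target). Field names are illustrative only.

def extract(raw_rows):
    """Simulate extraction: yield rows from a source (stand-in for S3/Glue)."""
    yield from raw_rows

def transform(rows):
    """Drop malformed rows and normalize amounts to integer cents."""
    for row in rows:
        if row.get("user_id") is None or row.get("amount") is None:
            continue  # skip rows missing required fields
        yield {"user_id": row["user_id"], "amount_cents": round(row["amount"] * 100)}

def load(rows, target):
    """Simulate loading into a warehouse table (stand-in for Redshift)."""
    target.extend(rows)
    return target

if __name__ == "__main__":
    source = [
        {"user_id": 1, "amount": 9.99},
        {"user_id": None, "amount": 5.00},  # malformed: dropped by transform
        {"user_id": 2, "amount": 12.50},
    ]
    table = load(transform(extract(source)), [])
    print(table)  # two clean rows, amounts in cents
```

In production this structure maps onto the tools listed above: extraction as a Glue crawler or Spark read over Parquet/Iceberg files in S3, transformation as DBT models or PySpark jobs, and loading orchestrated by Airflow or Step Functions.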
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1613646