Posted on: 06/05/2026
Job Title: AWS Data Platform Engineer (Data Science + DevOps)
Location: Flexible / Remote / [Your Location]
Experience Required: 5+ years
About the Role:
We are looking for an AWS Data Platform Engineer with a strong blend of data engineering, data science, and DevOps expertise. This hybrid role focuses on building scalable data platforms, enabling machine learning workflows, and automating infrastructure provisioning and deployments across our AWS ecosystem.
Key Responsibilities:
- Design, build, and maintain scalable data platforms and pipelines on AWS
- Implement and manage infrastructure using Terraform (Infrastructure as Code)
- Develop and optimize data transformation workflows using dbt
- Collaborate with data scientists to build, deploy, and monitor machine learning models
- Perform exploratory data analysis (EDA) and support feature engineering
- Build and maintain CI/CD pipelines for data and ML workflows (MLOps)
- Monitor data pipelines, infrastructure, and model performance
- Ensure best practices in cloud architecture, security, and cost optimization
- Work closely with cross-functional teams across data, engineering, and product
Required Skills & Qualifications:
- Strong hands-on experience with AWS services: EC2, S3, Lambda, RDS, Redshift, Glue, IAM
- Proficiency in Python for data engineering and data science tasks
- Experience with Terraform or similar IaC tools
- Strong understanding of DevOps practices (CI/CD, automation, monitoring)
- Hands-on experience with dbt (Data Build Tool)
- Solid understanding of data science fundamentals: data analysis, feature engineering, model evaluation
- Experience with libraries like Pandas and NumPy
- Familiarity with Git and version control
Preferred Qualifications:
- Experience with MLOps and deploying ML models in production
- Familiarity with ML frameworks (Scikit-learn, TensorFlow, or PyTorch)
- Experience with orchestration tools (Airflow, Step Functions)
- Knowledge of containerization (Docker, Kubernetes)
- Experience with data warehousing tools (Snowflake, Redshift, BigQuery)
- Exposure to experiment tracking and model monitoring tools
What We Offer:
- Competitive compensation and benefits
- Opportunity to work across data engineering, DevOps, and data science
- Flexible work environment
- High ownership and growth opportunities
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1633901