Posted on: 06/04/2026
Description:
- Proficiency in AWS Big Data services
- Strong understanding of cloud computing concepts and services
- Experience with data processing frameworks such as Apache Hadoop or Apache Spark
- Familiarity with data storage solutions including Amazon S3 and Amazon Redshift
- Ability to design and implement data pipelines for efficient data processing
- Minimum 5 years of experience in AWS Big Data required
Roles & Responsibilities:
- Act as a subject matter expert (SME); collaborate with and manage the team to deliver on its objectives
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems within the immediate team and across multiple teams
- Facilitate knowledge sharing and mentoring within the team to enhance overall team capabilities
- Monitor project progress and ensure alignment with project timelines and deliverables
- Knowledge of AWS services such as EC2, S3, RDS, Lambda, and CloudFormation
Posted in: Data Engineering
Functional Area: Big Data / Data Warehousing / ETL
Job Code: 1626338