Posted on: 05/11/2025
Responsibilities:
- Design, develop, and maintain scalable data pipelines.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business objectives.
- Implement best practices for data modelling, schema design, and version control to maintain data integrity and consistency.
- Provide technical expertise and support to stakeholders, helping them leverage data effectively for insights and decision-making.
- Optimise data processes and workflows to ensure efficiency, reliability, and performance.
Requirements:
- 5+ years of experience with Python, Apache Airflow, and data lake technologies (e.g., Snowflake).
- Proven ability to build and optimise large-scale data pipelines.
- Strong SQL skills for complex data analysis.
- Familiarity with data orchestration tools such as Apache Airflow; an understanding of DAGs is a plus (a minimal DAG sketch follows this list).
- Experience with AWS services, including IAM, S3, Lambda, CloudFormation, DynamoDB, RDS, ECS/EKS, and EC2 (see the boto3 sketch after this list).
- Proficiency in Agile methodologies.
- Experience working with OneTrust APIs and establishing data sources.
- Knowledge of the NIST Secure Software Development Framework (SSDF) and OWASP SAMM.
- Understanding of risk management methodologies (e.g., NIST RMF).
- Familiarity with modern data warehousing concepts.
- Experience with DevOps practices and automation tools.
- Strong problem-solving skills and the ability to work both independently and collaboratively.
- Experience with Infrastructure as Code (IaC) tools such as Terraform and CloudFormation.
- Understanding of OWASP Top 10 vulnerabilities.
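For reference on the Airflow/DAG item above, the sketch below shows what a minimal DAG looks like. It is purely illustrative: the DAG id, schedule, and task names are hypothetical, and the `schedule` argument assumes Airflow 2.4+ (older releases use `schedule_interval`).

```python
# A minimal, hypothetical Airflow DAG: one extract task feeding one load task.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder extract step; a real pipeline might pull from S3 or an API.
    return {"rows": 100}


def load(ti):
    # Pull the upstream task's return value via XCom and "load" it.
    payload = ti.xcom_pull(task_ids="extract")
    print(f"Loading {payload['rows']} rows")


with DAG(
    dag_id="example_daily_pipeline",  # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # The >> operator declares the dependency edge: extract runs before load.
    extract_task >> load_task
```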
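Similarly, as a minimal illustration of the AWS-services item, the snippet below writes a small JSON run manifest to S3 with boto3. The bucket name, key, and helper function are hypothetical placeholders, and credentials are assumed to come from the usual IAM mechanisms (environment variables, instance profile, etc.).

```python
# A short boto3 sketch: upload a JSON manifest describing a pipeline run to S3.
import json

import boto3

s3 = boto3.client("s3")


def write_run_manifest(bucket: str, key: str, record_count: int) -> None:
    """Upload a small JSON manifest for a pipeline run (hypothetical helper)."""
    body = json.dumps({"record_count": record_count, "status": "ok"})
    s3.put_object(Bucket=bucket, Key=key, Body=body.encode("utf-8"))


if __name__ == "__main__":
    # Bucket and key are placeholders, not values from the posting.
    write_run_manifest("example-data-bucket", "manifests/run-001.json", 100)
```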
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1570360