Posted on: 06/10/2025
Job Description:
- Design, develop, and maintain data pipelines and ETL processes for efficient data integration and transformation.
- Manage and optimise data storage and data flows on Oracle Cloud Infrastructure (OCI).
- Work with large-scale datasets and ensure data quality, consistency, and reliability across systems.
- Collaborate with cross-functional teams to understand business requirements and deliver data-driven solutions.
- Implement data governance, security, and compliance standards.
- Monitor data pipelines, troubleshoot issues, and ensure high availability of data platforms.
- Optimise database performance and ensure cost-effective cloud resource utilisation.
Qualifications:
- Proficiency in Oracle Cloud Infrastructure (OCI) and Oracle Autonomous Data Warehouse.
- Hands-on experience with ETL tools (e.g., Oracle Data Integrator, Informatica, Talend, or similar).
- Strong knowledge of SQL, PL/SQL, and database performance tuning.
- Experience with data warehousing concepts and big data technologies.
- Familiarity with Python or Scala for data processing and automation.
- Experience with streaming data pipelines (e.g., Kafka, Spark Streaming).
- Knowledge of data modelling and data governance best practices.
- Exposure to containerisation (Docker, Kubernetes) is a plus.
- Strong analytical and problem-solving abilities.
- Excellent communication and collaboration skills.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1556512