
Job Description

Project Description :

Support one of the top Australian banks as they seek to modernise their data and analytics platform.

You will be working directly with IT and business stakeholders in the Data and Platform team to implement the bank's data strategy to become the best AI bank in the world.

Skills required :

4 to 8 years of total experience across the following skills :

We use a broad range of tools, languages, and frameworks. We don't expect you to know them all, but experience with or exposure to some of these (or equivalents) will set you up for success in this team :

- Extensive experience in designing, building, and delivering enterprise-wide data ingestion, data integration, and data pipeline solutions

- Strong Data Architecture expertise, including different data modelling techniques and design patterns (conceptual, logical, physical; semantic modelling preferred)

- Strong knowledge of data governance, such as data lineage, technical metadata, data quality, and reconciliation

- Ability to drive platform efficiency through automation and AI capabilities

- AWS Data Stack : EMR, Glue, Redshift, Athena, S3, Lambda, ECS

- Data Orchestration & Pipelines : Airflow, Dataform

- Data Formats & Modelling : Iceberg, JSON, XML, CSV, Data Modelling

- Programming & DevOps : Python, SQL, Git, GitHub Actions, TeamCity, Jenkins, Octopus, Unix shell scripting

- ETL & Ingestion : File ingress/egress solutions in AWS

- Security and Observability : DevSecOps, Artifactory, Observability tooling

- Testing & Automation : test automation frameworks, Jupyter Notebooks

- Familiarity with data warehousing and build experience in Teradata, Oracle

- Experience in visualisation tools such as Power BI, Tableau

- Familiarity and experience with Agile processes

- AWS Data Engineer Associate certification

Nice To Have :

- AWS Solution Architect certification

- Containerisation (Docker, Kubernetes)

- Data visualisation tools and integration with Tableau, Power BI

- Alation

- Observability tools (e.g., Observe, Splunk, or Prometheus/Grafana)

- Ab Initio or dbt tooling

- Experience with Parquet File Format, Iceberg tables

- AWS Glue Data Catalog & AWS DataZone

- Markets domain knowledge

Roles & Responsibilities :

- Design, build, and deliver a new cloud data solution to transform our international regulatory reporting requirements

- Lead the design and delivery of cost-effective, scalable data solutions aligned with strategic goals that meet performance, security, and operational requirements.

- Drive solution architecture decisions, ensuring alignment with enterprise architecture principles and business priorities

- Engineer robust data product assets and pipelines in AWS (S3, Glue, Iceberg, Kinesis, Airflow, SageMaker, Redshift) that integrate with other applications, including SaaS reporting applications, e.g., Axiom

- Provide technical data governance and risk management

- Lead a team of data engineers, providing technical guidance, reviewing work, and mentoring team members to deliver high-quality data products

- Define and implement engineering standards, including data modelling, ingestion, transformation, and egression patterns and reviews

- Collaborate across teams to ensure a secure, efficient, and well-documented solution.

- Learn and contribute to continuous improvement initiatives within the team.

