
Job Description

(The client is a leading fintech company.) Note: only diversity candidates are being considered for this role.

Your key responsibilities :

- You are responsible for implementing the new project on GCP (Spark, Dataproc, Dataflow, BigQuery, Terraform, etc.) across the whole SDLC chain

- You are responsible for supporting the migration of current functionalities to Google Cloud

- You are responsible for the stability of the application landscape and for supporting software releases

- You also provide support on L3 topics and application governance

- You are responsible for coding in the CTM area as part of an agile team (Java, Scala, Spring Boot)

Your skills and experience :

- You have experience with databases (HDFS, BigQuery, etc.) and software development, preferably with Big Data and GCP technologies

- You have a strong understanding of the Data Mesh approach and of integration patterns

- You understand Party data and its integration with Product data

- Your architectural skills for big data solutions, especially interface architecture, allow a fast start

- You have experience with at least some of: Spark, Java, Scala, Python, Maven, Artifactory, the Hadoop ecosystem, GitHub Actions, GitHub, and Terraform scripting

- You have knowledge of customer reference data and customer account-opening processes, and preferably of regulatory topics around Know Your Customer (KYC) processes

- You work very well in teams but also independently, and you are constructive and goal-oriented

- Your English skills are good; you can communicate professionally and also engage in informal small talk with the team
