hirist

American Express - Big Data Engineer II - Java/Python

Posted on: 28/08/2025

Job Description

Purpose of the Role :


LUMI is the company's largest Big Data platform, ideally suited for computationally and/or data-intensive processing applications. Whether the data needs to be processed in a batch, online, or streaming manner, LUMI provides robust capabilities to handle such workloads effectively and cost-efficiently.


A hub of hardworking Big Data engineers and exciting, emerging technologies, the Cornerstone platform offers an environment where engineers are challenged every day to build world-class products.


As we embark on the journey to move to the public cloud (GCP), you will be part of a fast-paced Agile team that designs, develops, tests, troubleshoots, and optimizes solutions created to simplify access to Amex's Big Data Platform.


Focus :


- Designs, develops, solves problems, debugs, evaluates, modifies, deploys, and documents software and systems that meet the needs of customer-facing applications, business applications, and/or internal end user applications.


Organizational Context :


- Member of an engineering or delivery and integration team reporting to an Engineering Manager or Engineering Director.


Responsibilities :


- Implement enterprise-grade, robust data migration solutions using Java and Python, facilitating seamless data transfer from on-premises environments to GCP (including Cloud Storage and BigQuery), leveraging Apache Airflow and Google Cloud Composer.

- Build secure, optimized data architectures on GCP by integrating services such as Cloud Storage, Pub/Sub, and Dataproc.

- Implement automated solutions for data delivery, monitoring, and troubleshooting.

- Monitor system performance and proactively optimize data pipelines for efficiency.

- Troubleshoot and resolve issues.

- Create and maintain comprehensive documentation for tools, architecture, processes, and solutions.

Data Pipeline Development :

- Build, test, and deploy data pipelines to move, transform, and process data from various sources to GCP.

- Ensure the reliability, scalability, and performance of data pipelines.

- Utilize GCP's big data technologies such as BigQuery, Dataflow, Dataprep, and Pub/Sub to implement effective data processing solutions.
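As an illustration of the kind of transform step such pipelines typically contain, here is a minimal sketch in Python. The function and field names are hypothetical and purely illustrative, not part of the role description; in practice a step like this would sit inside an Airflow task before loading rows into BigQuery.

```python
from datetime import datetime, timezone


def transform_records(raw_rows):
    """Clean raw source rows into load-ready dicts.

    Drops rows missing an id, coerces types, and stamps each
    row with a UTC load timestamp (all field names are illustrative).
    """
    cleaned = []
    for row in raw_rows:
        if not row.get("id"):
            continue  # skip rows without a primary key
        cleaned.append({
            "id": str(row["id"]),
            "amount": round(float(row.get("amount", 0)), 2),
            "loaded_at": datetime.now(timezone.utc).isoformat(),
        })
    return cleaned
```

A step like this keeps the transform pure and unit-testable, independent of the GCP services that surround it in the pipeline.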


Minimum Qualifications :


- Overall 3-5 years of developer experience.

- Proficiency in Java, Python, and shell scripting.

- Strong SQL knowledge.

- Proficiency in Google Cloud Platform services, especially Cloud Storage, BigQuery, Dataproc, and Pub/Sub.

- Proficiency in an RDBMS such as Oracle, Postgres, or MySQL, and good exposure to at least one NoSQL database such as Cassandra.

- Expertise in Git and CI/CD processes.

- Experience working in an agile application development environment.

- Provide technical support for applications, troubleshooting environment, software, and application-level issues.


- Write and test programs using Unix shell scripting.


Preferred Qualifications :


- Hands-on experience with, or exposure to, DevOps best practices and implementation.

- Hands-on or exposure to platform engineering including networking and firewall.

- Hands-on or exposure to GenAI integrations including LLMs and RAG.



"I began working at American Express in 2007 through a campus recruitment program. I now lead a team managing business intelligence tools and platforms, creating data capabilities and insights for measuring the performance of acquisition channels, marketing campaigns, cobrand products, and digital experience channels. It's been exciting and rewarding!"


Neha Singh, Vice President

Enterprise Business Intelligence Engineering


The job is for:

Women candidates preferred
Differently-abled candidates preferred
Ex-defence personnel preferred
For women joining back the workforce