hirist

Job Description

Responsibilities:

- Develop solutions for Epsilon that deliver high-quality personalized recommendations to our customers across different channels

- Work with the Data Science team to ensure seamless integration and support of machine learning models

- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud-native data architectures

- Develop end-to-end (Data/Dev/MLOps) pipelines, drawing on an in-depth understanding of cloud platforms, the AI/ML lifecycle, and the business problem, to ensure solutions are delivered efficiently and sustainably

- Collaborate with other members of the team to ensure high-quality deliverables

- Learn and implement the latest design patterns in Data Engineering

Qualifications:

- Bachelor's degree in Engineering or a related field, with 10+ years of relevant experience

- Tech stack: Python, PySpark, Microservices, Docker, Serverless Frameworks, and Databricks

- Experience working on Generative AI solutions

- Databricks certification preferred

- Hands-on experience building ETL workflows/data pipelines

- Experience with relational databases and SQL (NoSQL experience is a plus)

- Experience with Cloud technologies (AWS or Azure)

- Experience designing and building APIs for high transaction volumes

- Experience building data and CI/CD/MLOps pipelines

- Familiarity with automated unit/integration test frameworks

- Experience with AdTech or MarTech technologies is an added advantage

- Good written and spoken communication skills; a great teammate

- Strong analytical thinking and the ability to interpret findings

Data Management:

- Good understanding of Data Modeling, Data Warehouse, and Data Catalog concepts and tools

- Experience with Data Lake architecture and with combining structured and unstructured data into unified representations

- Ability to reduce large quantities of unstructured data into a form suitable for analysis

- Ability to manipulate large datasets (millions of rows, thousands of variables)

Software Development:

- Ability to write code in languages such as Python and PySpark, and shell scripts on Linux

- Familiarity with software development methodologies such as Agile/Scrum

- Love of learning new technologies; keeps abreast of the latest developments in cloud architecture and drives the organization to adopt emerging best practices

Architecture and Infrastructure:

- Architectural design experience on AWS/Azure and Databricks

- Architectural design experience for applications with high transaction volumes

- Experience delivering software with AWS EC2, S3, EMR/Glue, Lambda, Data Pipeline, CloudFormation, Redshift/Snowflake, etc.

- Good working knowledge of UNIX/Linux systems

- Experience designing and building large-scale enterprise systems

In addition, the candidate should have strong business insight and strong interpersonal and communication skills, yet also be able to work independently. They should be able to communicate findings, and explain how techniques work, in a manner that all stakeholders, both technical and non-technical, can understand.
