
Job Description

Description :


At Algonomy, we believe the future of our economy is Algorithmic, where businesses will develop resilient, adaptive and agile decisioning abilities that will constantly test and refine AI-driven actions to create the best personal experience for every individual customer at scale.

We aim to become the algorithmic bridge between consumers and brands/retailers, and to lead our customers through the Algorithmic transformation imperative.

The name Algo-nomy signifies expertise in algorithms.

As technology evolves our lives (and our clients') at hyper-speed, Algonomy stands as a bold, creative, and agile brand; these are also the very qualities that every digital-first business needs in order to be successful in the new normal.

We are ambitious, we create category-leading solutions in our markets, and we are constantly learning, inventing, and adapting to stay ahead of our industry's needs.

We are looking for a Data Engineer for the Professional Services team's Customer Data Platform (CDP).

The ideal candidate has experience working with large-scale data processing, real-time ingestion, and customer data models.

Responsibilities :


- Develop and maintain batch and real-time data pipelines for customer, transactional, and behavioral data.

- Build ETL/ELT workflows using Python, SQL, and cloud-native services.

- Design data models for Customer 360, events, and identity resolution.

- Implement data quality checks, validation, and monitoring (see the sketch after this list).

- Integrate CDP datasets with downstream marketing, analytics, and activation systems.

- Optimize storage, compute, and pipeline performance in cloud environments.

- Collaborate with architects, data science, and product teams on CDP onboarding and enhancements.
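
As a brief illustration of the batch pipeline and data quality work described above, here is a minimal Python sketch. It is purely illustrative: the record fields, schema, and validation rule are assumptions for the example, not Algonomy's CDP model.

# Illustrative only: a toy batch step with a simple data quality gate.
# Field names and the validation rule are hypothetical.
from datetime import datetime, timezone

REQUIRED_FIELDS = {"customer_id", "event_type", "event_ts"}

def validate(record: dict) -> bool:
    """Reject records missing required fields or with an empty customer_id."""
    return REQUIRED_FIELDS.issubset(record) and bool(record["customer_id"])

def run_batch(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split a batch into valid rows (to load) and rejects (to quarantine and monitor)."""
    valid, rejected = [], []
    for rec in records:
        (valid if validate(rec) else rejected).append(rec)
    return valid, rejected

if __name__ == "__main__":
    batch = [
        {"customer_id": "c-001", "event_type": "purchase",
         "event_ts": datetime.now(timezone.utc).isoformat()},
        {"customer_id": "", "event_type": "page_view", "event_ts": None},
    ]
    good, bad = run_batch(batch)
    print(f"loaded={len(good)} rejected={len(bad)}")

In a production pipeline the rejects would feed a monitoring or quarantine table rather than being dropped silently.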

Required Skills :


- At least 6 years of experience implementing data warehouse and BI projects, and 8+ years of overall experience.

- Strong in Python, SQL, and distributed data processing.

- Experience with cloud platforms (AWS/Azure) and one of the data warehouses (Snowflake/BigQuery/Redshift).

- Hands-on experience with streaming technologies such as Kafka, Kinesis, or Event Hubs.

- Understanding of customer data models, CDP concepts, and identity resolution.

- Background in ETL development, data modeling, and API integrations.

- Understanding of API design; able to build API-based connectors to fetch data and orchestrate data publishing (see the connector sketch below).
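
As a sketch of the API-based connector work mentioned in the last point: the endpoint path, parameters, and response shape below are assumptions for illustration, not a real Algonomy API.

# Illustrative only: a paginated pull from a hypothetical REST endpoint.
import requests

def fetch_events(base_url: str, api_key: str, page_size: int = 500):
    """Yield event records page by page until the (assumed) API reports no next page."""
    page = 1
    while True:
        resp = requests.get(
            f"{base_url}/events",                      # hypothetical resource path
            headers={"Authorization": f"Bearer {api_key}"},
            params={"page": page, "page_size": page_size},
            timeout=30,
        )
        resp.raise_for_status()
        payload = resp.json()
        yield from payload.get("items", [])
        if not payload.get("has_next"):                # assumed pagination flag
            break
        page += 1

# Usage (hypothetical endpoint):
# for event in fetch_events("https://api.example.com/v1", api_key="..."):
#     publish(event)   # downstream publishing step, not shown here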

Preferred Skills :


- Experience implementing CDP, MDM, CRM, or marketing data systems.

- Familiarity with orchestration tools (e.g., Airflow) and ETL tools (e.g., Informatica, SnapLogic).

- Knowledge of PII handling, governance, and compliance frameworks.

What You'll Achieve :


- Deliver scalable, high-quality data pipelines for unified customer profiles.

- Enable real-time personalization and marketing activation through reliable data flows.

