Posted on: 12/12/2025
Description:
- Build ETL/ELT workflows using Python, SQL, and cloud-native services.
- Design data models for Customer 360, events, and identity resolution.
- Implement data quality checks, validation, and monitoring.
- Integrate CDP datasets with downstream marketing, analytics, and activation systems.
- Optimize storage, compute, and pipeline performance in cloud environments.
- Collaborate with architects, data science, and product teams on CDP onboarding and enhancements.
Required Skills:
- Strong proficiency in Python, SQL, and distributed data processing.
- Experience with cloud platforms (AWS/Azure) and at least one cloud data warehouse (Snowflake/BigQuery/Redshift).
- Hands-on experience with streaming technologies such as Kafka, Kinesis, or Event Hubs.
- Understanding of customer data models, CDP concepts, and identity resolution.
- Background in ETL development, data modeling, and API integrations.
- Understanding of API design, including building API-based connectors to fetch data and orchestrate data publishing.
Preferred Skills:
- Familiarity with orchestration tools (Airflow) and ETL tools (Informatica/SnapLogic).
- Knowledge of PII handling, governance, and compliance frameworks.
What You'll Achieve:
- Enable real-time personalization and marketing activation through reliable data flows.
Posted in
Data Engineering
Functional Area
Data Engineering
Job Code
1589255