Posted on: 03/03/2026
Note: Women Candidates Preferred
Description:
Who we are:
At Kenvue, we realize the extraordinary power of everyday care. Built on over a century of heritage and rooted in science, we're the house of iconic brands - including Neutrogena, Aveeno, Tylenol, Listerine, Johnson's and BAND-AID Brand Adhesive Bandages - that you already know and love. Science is our passion; care is our talent. Our global team is made up of ~22,000 diverse and brilliant people, passionate about insights and innovation, and committed to delivering the best products to our customers. With expertise and empathy, being a Kenvuer means having the power to impact the lives of millions of people every day. We put people first, care fiercely, earn trust with science and solve with courage - and we have brilliant opportunities waiting for you! Join us in shaping our future - and yours.
What you will do:
As a Data Engineer on the Enterprise Data & Analytics team, you will unify complex, siloed data from internal ERP systems (such as SAP) and Product Lifecycle Management (PLM) systems, such as Laboratory Information Management Systems, with external retail signals (POS data, shipment tracking) to optimize every stage from production to the shelf.
- CPG Data Harmonization: Designing pipelines to ingest, harmonize, and curate disparate data sources - including ERP (e.g., SAP), PLM (e.g., Laboratory Information Management Systems), Point of Sale (POS) data from global retailers (e.g., Walmart, Costco, Carrefour, Tesco), and internal WMS/TMS systems - into a unified Delta Lake.
- Lakehouse Implementation: Developing a Medallion Architecture (Bronze, Silver, Gold) in Databricks to ensure high-quality, trusted aggregated data models and curated data products.
- Supply Chain Orchestration: Using Databricks Workflows or Azure Data Factory to automate complex ETL/ELT processes that power demand forecasting and inventory replenishment models.
- Operational Monitoring: Implementing real-time streaming (Spark Structured Streaming) to provide immediate visibility into supply chain disruptions, such as logistics bottlenecks or stockouts.
- Governance & Security: Managing fine-grained access control and data lineage for multi-brand or regional datasets using Unity Catalog to ensure compliance with global data privacy standards.
What we are looking for:
- Education: Bachelor's degree in engineering, computer science, or a related field.
- Programming: 5-8 years of total work experience, with at least 3 years of experience in advanced SQL, PySpark, or Python.
- Core Platform: Strong grasp of Databricks (Delta Lake, DLT, Workflows) and Azure/cloud services (ADLS Gen2, Key Vault).
- Data Lineage: Experience leveraging Unity Catalog to ensure lineage from raw to curated data assets.
- DevOps/DataOps: Proficiency in Git, CI/CD (Azure DevOps/GitHub Actions), and Infrastructure as Code (Terraform) for stable, repeatable deployments.
- Agile Development: Experience working in a collaborative agile environment.
- Cloud Architecture: Understanding of cloud architecture principles and best practices.
- Experience designing and building end-to-end solutions that meet business requirements and adhere to scalability, reliability, and security standards.
Desired Certifications:
- Microsoft Certified: Azure Fundamentals
- Microsoft Certified: Azure Data Engineer Associate
- Databricks Certified Data Engineer Associate
What's in it for you:
- Competitive Total Rewards Package
- Paid Company Holidays, Paid Vacation, Volunteer Time & More!
- Learning & Development Opportunities
- Employee Resource Groups
This list could vary based on location/region.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1617671