Posted on: 06/01/2026
Core Responsibilities:
- Pipeline Engineering & Orchestration: Designing and maintaining scalable ETL/ELT pipelines, using Azure Data Factory (ADF) or Databricks Workflows for ingestion and Azure Databricks (PySpark/SQL) for complex transformations.
- Lakehouse Management: Implementing and optimizing the Medallion Architecture (Bronze, Silver, and Gold layers) on Delta Lake, ensuring data consistency through ACID transactions.
- DataOps & Automation: Automating the deployment, testing, and monitoring of data workflows in a Scrum-of-Scrums environment, using Azure DevOps or GitHub Actions CI/CD pipelines for Databricks notebooks and jobs.
- User Feedback: Managing ticket intake and driving appropriate resolution in collaboration with other DevOps and Data Engineers.
- Governance & Security: Managing centralized access control, data lineage, and auditing across workspaces using Unity Catalog.
- Performance & FinOps: Tuning Spark jobs (partitioning, Z-ordering, caching) for performance and monitoring cluster usage to optimize Azure and Databricks costs.
- Operational Support: Providing L2/L3 support for production job failures, performing root cause analysis (RCA), and ensuring strict adherence to Service Level Agreements (SLAs).
- Collaboration: Working closely with business analysts, data scientists, and DevOps engineers to ensure successful data platform implementations.
- Communication: Excellent communication and articulation skills, especially when engaging with stakeholders.
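As a rough illustration of the DataOps responsibility above, a deployment for Databricks notebooks might be wired up in Azure DevOps along these lines. This is a minimal sketch, not a Kenvue-specific pipeline: the workspace path, the `databricks-secrets` variable group, and the `tests/` folder are all assumptions.

```yaml
# azure-pipelines.yml — minimal sketch; paths and variable-group
# names below are illustrative assumptions, not real project values.
trigger:
  branches:
    include: [main]

pool:
  vmImage: ubuntu-latest

variables:
- group: databricks-secrets   # assumed to supply DATABRICKS_HOST / DATABRICKS_TOKEN

steps:
- script: pip install databricks-cli pytest
  displayName: Install tooling

- script: pytest tests/
  displayName: Run tests before deploying

- script: |
    # Push notebooks to a shared workspace folder (hypothetical path)
    databricks workspace import_dir ./notebooks /Shared/etl --overwrite
  displayName: Deploy notebooks
  env:
    DATABRICKS_HOST: $(DATABRICKS_HOST)
    DATABRICKS_TOKEN: $(DATABRICKS_TOKEN)
```

Running tests in a step before the deploy step is the key DataOps point here: a failing test stage stops the pipeline, so broken notebooks never reach the workspace.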
Required Technical Skills:
- Cloud Platform: Mastery of Azure services, specifically ADLS Gen2, Azure Data Factory, Azure Synapse, and Azure Key Vault.
- Databricks Ecosystem: Expert knowledge of PySpark, Spark SQL, Delta Live Tables (DLT), and Databricks Workflows.
- Programming: High proficiency in Python and SQL; experience with Scala or PowerShell is a plus.
- Infrastructure as Code (IaC): Experience provisioning and managing Azure data resources through code.
- Monitoring: Familiarity with Azure Monitor, Log Analytics, and Application Insights for proactive alerting.
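The Spark SQL and Delta Lake skills listed above come together in routine table maintenance of the kind the Performance & FinOps responsibility describes. A small illustrative example, with a hypothetical table and column names:

```sql
-- Illustrative Delta Lake maintenance; "sales_gold" and its
-- columns are hypothetical, not taken from any real schema.
OPTIMIZE sales_gold
ZORDER BY (customer_id, order_date);  -- co-locate rows on common filter columns

VACUUM sales_gold RETAIN 168 HOURS;   -- drop stale data files older than 7 days
```

Z-ordering speeds up selective reads by improving data skipping, while periodic VACUUM keeps storage (and therefore cost) under control.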
Qualifications:
- Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field.
- Experience: 4 to 8 years in data engineering or DataOps, with at least 2 years specifically focused on the Databricks/Azure stack.
Preferred Certifications:
- Microsoft Certified: Azure Data Engineer Associate.
- Databricks Certified Professional Data Engineer.
What's In It For You:
- Competitive Total Rewards Package.
- Paid Company Holidays, Paid Vacation, Volunteer Time & More!
- Learning & Development Opportunities.
- Employee Resource Groups.
This list could vary based on location/region.
Note: Total Rewards at Kenvue include salary, bonus (if applicable), and benefits.
Your Talent Access Partner will be able to share more about our total rewards offerings and the specific salary range for the relevant location(s) during the recruitment & hiring process.
Kenvue is proud to be an Equal Opportunity Employer.
All qualified applicants will receive consideration for employment based on business needs, job requirements, and individual qualifications, without regard to race, color, religion, sex, sexual orientation, gender identity, age, national origin, protected veteran status, or any other legally protected characteristic, and will not be discriminated against on the basis of disability.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1597516