Posted on: 08/04/2026
Description :
About the job (Job Responsibilities) :
1. Platform Strategy & Technical Leadership :
- Own the enterprise data platform strategy and roadmap, guiding the move from legacy systems to Microsoft Fabric, Databricks or Snowflake.
- Act as the primary technical decision-maker for data platform tools, standards, and architecture choices.
- Lead architecture and design reviews, setting the quality bar for the engineering organization.
- Represent data platform engineering in leadership forums and vendor discussions.
- Stay current on data integration and automation trends and apply them where they add value.
2. Project Leadership & Cross-Functional Delivery :
- Lead major data platform projects end-to-end, coordinating across engineering, analytics, IT, and business teams
- Serve as the technical lead for the migration program.
- Use a value-first delivery approach: prototype early, reduce data debt, and ensure operational readiness.
- Create and maintain architecture documents, runbooks, and status updates.
3. Platform Architecture & Migration Execution :
- Design and execute the Lakehouse architecture, including Lakehouse/Warehouse layers and pipelines. Build the enterprise medallion architecture (Bronze, Silver, Gold).
- Migrate and decommission SAP BW and on-prem SQL systems, including historical loads and validation.
- Define and enforce platform standards, reusable templates, and engineering guardrails.
- Design integration patterns for Palantir Foundry and Aera within the Fabric/Databricks ecosystem.
4. Data Engineering & Pipeline Development :
- Build enterprise-grade pipelines using Fabric Pipelines, ADF, and Databricks.
- Integrate SAP, CRM, ERP, IoT, and SQL sources using appropriate extraction methods.
- Implement real-time pipelines with Eventstream or Event Hubs.
- Own transformation logic using Delta Live Tables, dbt, or Fabric Notebooks.
- Reduce data duplication through virtualization patterns (Shortcuts, Delta Sharing).
- Own monitoring, SLAs, alerting, and incident response.
5. DataOps, Automation & Engineering Standards :
- Define CI/CD standards using Azure DevOps and Databricks Asset Bundles.
- Implement automated testing frameworks (DLT checks, Great Expectations, dbt tests).
- Continuously improve performance, cost efficiency, and observability.
- Own documentation standards and use AI tools to streamline documentation.
6. Data Products, Governance & Observability :
- Design and deliver governed data products with clear ownership, SLAs, and contracts.
- Implement automated data quality controls across ingestion and transformation layers.
- Build and maintain the data catalog and lineage using Purview and Unity Catalog.
- Deploy data observability for schema drift, freshness, volume anomalies, lineage, and cost monitoring.
- Define systems of record and enforce access controls (RBAC, RLS, masking).
7. AI & Intelligence Platform Enablement :
- Provide high-quality, governed, AI-ready data for ML, RAG, and agentic AI workloads.
- Build data prep pipelines for AI (preprocessing, embeddings, retrieval support).
- Integrate Palantir Foundry and its ontology with Fabric/Databricks.
- Champion AI-powered engineering tools (GitHub Copilot, Databricks AI Assistant, Azure AI).
We believe you bring (Education & Experience) :
- Bachelor's degree in Computer Science, Software Engineering, Information Systems, Data Engineering, or a closely related technical discipline.
- Master's degree in Computer Science, Data Science, or Information Management preferred; equivalent demonstrated expertise through professional experience and certifications will be considered in lieu of an advanced degree.
- 10+ years of hands-on experience in data engineering, data platforms, or data architecture, including 3-5 years at a senior/principal technical level owning platform strategy and enterprise-scale technical direction.
- Proven ability to lead complex, multi-workstream data platform projects end to end across engineering, analytics, IT, and business teams.
- Experience owning and delivering an enterprise data platform strategy, including architecture choices, technology selection, and standards.
- Hands-on experience with Microsoft Fabric, Databricks, and/or Snowflake Lakehouse environments.
- Strong hands-on background in designing enterprise data architectures (lakehouse, data lake, data warehouse, medallion layers).
- Proven ability to execute enterprise on-prem-to-cloud migrations, including mapping, historical loads, validation, and cutover.
- Strong track record delivering multi-style data pipelines: ETL/ELT, replication, virtualization, and streaming/event-based.
- Demonstrated skills in pipeline monitoring, data observability, and incident management, including SLA ownership and root cause analysis.
- Experience building data governance foundations (catalog, lineage, quality rules, access controls, ownership models).
- Experience designing and delivering self-service or federated data products.
- Exposure to Palantir Foundry (ontology, pipelines, AIP integration).
- Experience working in manufacturing, chemicals, or process-industry environments.
- Background enabling data science/ML teams through feature pipelines, MLOps, and model scoring pipelines.
- Experience defining engineering standards and reusable frameworks for a broader data engineering organization.
Competencies :
- Technical Authority, Principal-Level Project Leadership, Architectural Judgment, Organizational Influence, Systems Thinking, DataOps Mindset, Data Observability Ownership, Governance Mindset, AI-Forward Thinking, Executive Communication.
- Advanced SQL, Python, Apache Spark, Data Modeling, ETL/ELT Development, Databricks (Core Engineering Platform), Data Integration Styles & Patterns (Open Table Formats, SAP Integration), Data Products & Self-Service Architecture, DataOps & DevOps (CI/CD for Data Pipelines, Infrastructure-as-Code), Data Quality, Governance & Observability, AI & Intelligence Platform Enablement.
Licenses/ certifications :
- Databricks Certified Data Engineer Professional: demonstrates principal-level mastery of the core engineering platform.
- Microsoft Certified: Fabric Analytics Engineer Associate (DP-600).
- Microsoft Certified: Azure Data Engineer Associate (DP-203): covers foundational Azure data services.
- Palantir Foundry certification, if available; documented hands-on project experience accepted in lieu.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1626845