Posted on: 12/11/2025
Description:
Job Summary:
- Assist in blueprinting and designing the organization's data platform, covering all relevant components.
- Collaborate with Integration and Data Architects to ensure seamless integration between systems and data models.
- Participate in discussions to refine and enhance the overall data architecture strategy.
- Engage throughout the complete data platform lifecycle, ensuring all components function cohesively.
- Act as a Subject Matter Expert, providing deep technical knowledge and advisory support.
- Influence and contribute to key decisions across multiple teams.
- Facilitate workshops and discussions to gather requirements and stakeholder feedback.
- Evaluate and continuously improve data architecture practices for efficiency and effectiveness.
- Design and build advanced data pipelines using Delta Lake, Auto Loader, and Delta Live Tables (DLT); an illustrative sketch follows this list.
- Lead and implement enterprise-scale Lakehouse initiatives and modern Data & Analytics Architecture patterns (Data Mesh, Data Products, Lakehouse Architecture).
- Program and debug using Python and PySpark to build scalable ETL/ELT pipelines.
- Architect data ingestion, transformation, and reusable pipeline components using Databricks technologies.
- Implement and manage data governance, cataloging, and security policies using Unity Catalog; an illustrative column-masking sketch follows this list.
- Enable secure data sharing and cross-platform access, including Lakehouse Federation and Delta Sharing.
- Integrate Databricks with BI tools (Power BI, Tableau, Looker) for real-time reporting and visualization.
- Ensure compliance with data privacy regulations (GDPR, HIPAA) through data anonymization and masking strategies.
- Optimize performance of data solutions and pipelines.
- Lead cross-functional initiatives with data science, analytics, and platform teams to deliver secure, scalable data products.
- Advocate for advanced data platform features (Mosaic AI, Vector Search, Model Serving, Databricks Marketplace).
- Maintain strong data modeling and warehousing practices.
- Apply hands-on experience with major cloud platforms (AWS, GCP, Azure) and their data engineering services.
- (Preferred) Familiarity with DBT (Data Build Tool).
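As an illustration of the Delta Live Tables and Auto Loader responsibilities above, the following is a minimal sketch of a bronze/silver pipeline. The landing path, table names, and columns (order_id, order_ts, amount) are assumptions chosen for the example, and the code runs only inside a Databricks DLT pipeline, where the dlt module and the spark session are provided by the runtime.

# Minimal DLT sketch: incremental ingestion with Auto Loader (bronze)
# and a cleaned, validated table (silver). Paths and columns are hypothetical.
import dlt
from pyspark.sql import functions as F

LANDING_PATH = "s3://example-bucket/landing/orders/"  # hypothetical landing zone

@dlt.table(comment="Raw orders ingested incrementally with Auto Loader.")
def orders_bronze():
    return (
        spark.readStream.format("cloudFiles")        # Auto Loader source
        .option("cloudFiles.format", "json")
        .load(LANDING_PATH)
    )

@dlt.table(comment="Typed, validated orders.")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")  # drop rows that fail the expectation
def orders_silver():
    return (
        dlt.read_stream("orders_bronze")
        .withColumn("order_ts", F.to_timestamp("order_ts"))
        .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    )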
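Similarly, a minimal sketch of the Unity Catalog governance and masking work: a SQL column mask created and attached from PySpark. The catalog, schema, table, column, and group names (main.crm.customers, email, pii_readers) are assumptions for the example; spark refers to the session provided by the Databricks runtime.

# Minimal Unity Catalog column-mask sketch; object and group names are hypothetical.
# The masking function reveals the email address only to members of an approved group.
spark.sql("""
    CREATE OR REPLACE FUNCTION main.crm.mask_email(email STRING)
    RETURNS STRING
    RETURN CASE
        WHEN is_account_group_member('pii_readers') THEN email
        ELSE concat('***@', split_part(email, '@', 2))
    END
""")

# Attach the mask to the column; queries from users outside the group
# automatically see the masked value.
spark.sql("""
    ALTER TABLE main.crm.customers
    ALTER COLUMN email SET MASK main.crm.mask_email
""")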
Posted in: Data Engineering
Functional Area: Technical / Solution Architect
Job Code: 1573557