Posted on: 27/07/2025
Key Responsibilities:
- Build, maintain, and optimize data models within Databricks Unity Catalog.
- Develop efficient data structures using Delta Lake, optimizing for performance, scalability, and reusability (see the illustrative sketch after this list).
- Collaborate with data engineers, architects, analysts, and stakeholders to ensure data model alignment with ingestion pipelines and business goals.
- Translate business and reporting requirements into robust data architecture using best practices in data warehousing and Lakehouse design.
- Maintain comprehensive metadata artifacts including data dictionaries, data lineage, and modeling documentation.
- Enforce and support data governance, data quality, and security protocols across data ecosystems.
- Continuously evaluate and improve modeling processes and standards.
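To give a concrete flavor of this kind of Delta Lake modeling and optimization work, here is a minimal PySpark sketch, assuming a Databricks workspace with Unity Catalog enabled. The catalog, schema, table, and column names (e.g. main.sales.dim_customer) are hypothetical placeholders, not details from this posting.

```python
# Illustrative sketch only; assumes a Databricks cluster with Unity Catalog.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dim-model-sketch").getOrCreate()

# Create a dimension table under Unity Catalog's three-level namespace
# (catalog.schema.table), stored as a Delta table. Names are hypothetical.
spark.sql("""
    CREATE TABLE IF NOT EXISTS main.sales.dim_customer (
        customer_sk BIGINT GENERATED ALWAYS AS IDENTITY,
        customer_id STRING NOT NULL,
        full_name   STRING,
        region      STRING,
        valid_from  TIMESTAMP,
        valid_to    TIMESTAMP
    ) USING DELTA
    COMMENT 'Type 2 customer dimension (illustrative)'
""")

# Compact small files and co-locate rows by a common filter column to
# improve scan performance on large tables.
spark.sql("OPTIMIZE main.sales.dim_customer ZORDER BY (region)")
```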
Required Skills and Experience:
- 10+ years of hands-on experience in data modeling in Big Data environments.
- Proficient in modeling methodologies including Kimball, Inmon, and Data Vault.
- Hands-on experience with modeling tools such as ER/Studio, ERwin, PowerDesigner, SQLDBM, dbt, or Lucidchart.
- Proven experience in Databricks with Unity Catalog and Delta Lake.
- Strong command of SQL and Apache Spark for querying and transformation (a brief sketch follows this list).
- Hands-on experience with the Azure Data Platform.
- Exposure to Azure Purview or similar data cataloging tools.
- Strong communication and documentation skills, with the ability to work in cross-functional agile teams.
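As a rough illustration of the SQL and Spark transformation skills listed above, the following PySpark sketch joins a fact table to a dimension and writes an aggregate back to the catalog. All table and column names are hypothetical, chosen only for the example.

```python
# Minimal transformation sketch; assumes the hypothetical tables from the
# earlier example exist in a Databricks/Unity Catalog environment.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("transform-sketch").getOrCreate()

orders = spark.table("main.sales.fact_orders")
customers = spark.table("main.sales.dim_customer")

# Join fact to dimension on the surrogate key and aggregate revenue per region.
revenue_by_region = (
    orders.join(customers, "customer_sk")
          .groupBy("region")
          .agg(F.sum("order_amount").alias("total_revenue"))
)

# Persist the aggregate as a managed Delta table.
revenue_by_region.write.mode("overwrite").saveAsTable(
    "main.sales.agg_revenue_by_region"
)
```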
Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field.
- Experience working in agile/scrum environments.
- Exposure to enterprise data security and regulatory compliance frameworks (e.g., GDPR, HIPAA) is a plus.
Posted in: Data Analytics & BI
Functional Area: Data Engineering
Job Code: 1519936