hirist

Job Description

Job Title : Databricks-Unity Catalog Architect

Location : Pune / Bangalore / Hyderabad (Hybrid - 2-3 days in office)

Experience : 10-15 Years (Minimum 4 Years in an Architect Role)

Notice Period : Immediate to 30 Days Preferred

About the Role :

We are seeking a seasoned Databricks Architect with deep expertise in Unity Catalog, Delta Lakehouse architecture, and the AWS data ecosystem to design and implement scalable, secure, and high-performing data lakehouse platforms in a multi-tenant environment. The ideal candidate brings strong technical leadership along with hands-on design and governance experience.

Key Responsibilities :


- Architect and design Databricks Workspaces tailored for multi-tenant environments using Unity Catalog for centralized governance.


- Build robust Lakehouse solutions using Delta Lake, Spark Structured Streaming, and batch pipelines.

- Drive data modeling, lineage, metadata management, and governance frameworks (RBAC, ABAC).

- Implement workflow orchestration using Databricks Workflows and CI/CD best practices.

- Establish naming conventions and workspace artifact standards, and enforce coding/documentation quality across teams.

- Integrate Unity Catalog with external systems and manage catalog versioning, schema evolution, and access policies.

- Lead performance tuning, cost optimization, and security hardening across Databricks and AWS platforms.

- Collaborate with stakeholders to define data strategy, architecture patterns, and data lifecycle policies.

- Provide guidance to data engineers and review architecture decisions, ensuring alignment with enterprise data platform standards.

Required Skills :


Databricks & Unity Catalog :


- 4+ years as an Architect on Databricks-based solutions

- Workspace and Unity Catalog architecture for multi-tenant and federated data environments

- Deep knowledge of Delta Lake, Spark Core, Spark SQL, Structured Streaming

- Experience in pipeline development, artifact organization, and governance enforcement

- Proficient with RBAC, ABAC, data lineage, and data quality frameworks

AWS Ecosystem :


- Hands-on with AWS Glue, Redshift, S3, Athena, Lambda

- IAM roles, VPC/network setup, Secrets Manager, API-based integrations

- Strong understanding of cloud security models, cost control, and workload isolation.

Other Skills :


- Proficient in Python, SQL, and CI/CD using GitHub/Azure DevOps

- Ability to document architecture, lead POCs, and present solutions to stakeholders

- Strong communication and team mentoring skills.

Preferred Qualifications :


- Databricks Certifications (Lakehouse Architect, Data Engineer, Unity Catalog Specialist)

- AWS Certified Data Analytics or Solutions Architect Associate/Professional

- Exposure to multi-cloud data architecture, and enterprise data mesh concepts.

Work Environment :


- Hybrid model (2-3 days in office)

- Locations : Pune / Bangalore / Hyderabad

- Dynamic team structure with global stakeholder engagement

- Culture of innovation, scalability, and impact.

Application Notes :


- Kindly share your resume with a professional photo embedded

- Mention relevant certifications, POCs handled, and multi-tenant project experience

- Immediate joiners or those with 30 days or less notice preferred.

