
Databricks Architect - Data Engineering

Kokken Robotics
Multiple Locations
10 - 18 Years

Posted on: 22/07/2025

Job Description :

- Overall 10-18 years of Data Engineering experience, with a minimum of 4+ years of hands-on experience in Databricks. Must be ready to travel onsite and work at the client location.

- Proven hands-on experience as a Databricks Architect or in a similar role, with a deep understanding of the Databricks platform and its capabilities.

- Analyze business requirements and translate them into technical specifications for data pipelines, data lakes, and analytical processes on the Databricks platform.

- Design and architect end-to-end data solutions, including data ingestion, storage, transformation, and presentation layers, to meet business needs and performance requirements.

- Lead the setup, configuration, and optimization of Databricks clusters, workspaces, and jobs to ensure the platform operates efficiently and meets performance benchmarks.

- Manage access controls and security configurations to ensure data privacy and compliance.

- Design and implement data integration processes, ETL workflows, and data pipelines to extract, transform, and load data from various sources into the Databricks platform (a minimal sketch of this pattern follows this list).

- Optimize ETL processes to achieve high data quality and reduce latency.

- Monitor and optimize query performance and overall platform performance to ensure efficient execution of analytical queries and data processing jobs.

- Identify and resolve performance bottlenecks in the Databricks environment.

- Establish and enforce best practices, standards, and guidelines for Databricks development, ensuring data quality, consistency, and maintainability.

- Implement data governance and data lineage processes to ensure data accuracy and traceability.

- Mentor and train team members on Databricks best practices, features, and capabilities.

- Conduct knowledge-sharing sessions and workshops to foster a data-driven culture within the organization.

- Will be responsible for Databricks Practice technical and partnership initiatives.

- Build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects.
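
For illustration only, below is a minimal PySpark sketch of the kind of ETL pipeline described in the list above: it reads a raw CSV source, applies a simple cleansing transformation, and loads the result into a Delta table. The source path, key column, and target table name are hypothetical placeholders, not details of this role.

    # Minimal ETL sketch on Databricks (hypothetical paths and table names).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # already provided in a Databricks notebook

    # Extract: read raw CSV files from cloud storage
    raw_df = (
        spark.read
        .option("header", "true")
        .option("inferSchema", "true")
        .csv("/mnt/raw/orders/")              # hypothetical mount point
    )

    # Transform: basic cleansing plus an audit column
    clean_df = (
        raw_df
        .dropDuplicates(["order_id"])         # hypothetical key column
        .withColumn("order_date", F.to_date("order_date"))
        .withColumn("ingested_at", F.current_timestamp())
    )

    # Load: write to a managed Delta table
    (
        clean_df.write
        .format("delta")
        .mode("overwrite")
        .saveAsTable("analytics.orders_clean")  # hypothetical target table
    )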

Roles & Responsibilities :

- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

- In-depth, hands-on implementation knowledge of Databricks: Delta Lake and Delta tables (including managing Delta tables), Databricks cluster configuration, and cluster policies.

- Experience handling structured and unstructured datasets.

- Strong proficiency in programming languages like Python, Scala, or SQL.

- Experience with Cloud platforms like AWS, Azure, or Google Cloud, and understanding of cloud-based data storage and computing services.

- Familiarity with big data technologies like Apache Spark, Hadoop, and data lake architectures.

- Develop and maintain data pipelines, ETL workflows, and analytical processes on the Databricks platform.

- Should have good experience in Data Engineering on Databricks, covering both batch processing and streaming (see the sketch after this list).

- Should have good experience creating Workflows and scheduling pipelines.

- Should have good exposure to making packages or libraries available in Databricks.

- Familiarity with Databricks default runtimes.

- Databricks Certified Data Engineer Associate/Professional Certification (Desirable).

- Should have experience working in an Agile methodology.

- Strong verbal and written communication skills.

- Strong analytical and problem-solving skills with a high attention to detail.
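
As a hedged illustration of the batch/streaming and workflow items referenced above, the sketch below uses Databricks Auto Loader with Structured Streaming to ingest JSON files incrementally into a Delta table; the availableNow trigger lets the same code run as a scheduled, batch-style job. Paths and table names are hypothetical assumptions.

    # Incremental ingestion sketch using Auto Loader (hypothetical paths).
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    events = (
        spark.readStream
        .format("cloudFiles")                                   # Databricks Auto Loader
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaLocation", "/mnt/chk/events_schema")
        .load("/mnt/raw/events/")
    )

    (
        events.writeStream
        .format("delta")
        .option("checkpointLocation", "/mnt/chk/events")
        .trigger(availableNow=True)        # process all available files, then stop
        .toTable("analytics.events_bronze")                     # hypothetical table
    )

In practice, a job like this would typically be wrapped in a Databricks Workflow task with a schedule or file-arrival trigger.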

Mandatory Skills :

Databricks, Unity Catalog, PySpark, ETL, SQL, Delta Live Tables
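
For context on the last of these skills, here is a minimal, hypothetical Delta Live Tables sketch: a bronze ingestion table plus a silver table guarded by a data-quality expectation. Table and path names are illustrative assumptions only.

    # Minimal Delta Live Tables pipeline sketch (hypothetical names and paths).
    import dlt
    from pyspark.sql import functions as F

    @dlt.table(comment="Raw orders ingested from cloud storage")
    def orders_bronze():
        return (
            spark.readStream                     # `spark` is provided by the DLT runtime
            .format("cloudFiles")
            .option("cloudFiles.format", "json")
            .load("/mnt/raw/orders/")
        )

    @dlt.table(comment="Cleansed orders")
    @dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")
    def orders_silver():
        return (
            dlt.read_stream("orders_bronze")
            .withColumn("order_date", F.to_date("order_date"))
        )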

