hirist

Job Description

Roles and Responsibilities :

- You will engage in a diverse range of impactful customer-facing Big Data projects, including developing reference architectures, how-to guides, and minimum viable products (MVPs).

- Lead strategic initiatives encompassing the end-to-end design, build, and deployment of industry-leading big data and AI applications.

- Provide architectural guidance that fosters the adoption of Databricks across business-facing functions.

- Collaborate with platform engineering teams to effectively implement Databricks services within our infrastructure.

- Evaluate new features and enhancements on the Databricks platform to ensure we leverage the latest capabilities.

- Design and implement various integration patterns involving Databricks and third-party tools such as Collibra and data quality solutions.

- Utilize your AWS administration and architecture expertise to optimize cloud resources and ensure seamless integration with Databricks.

- Leverage your hands-on experience with Databricks Unity Catalog to implement robust data governance and lineage capabilities.

- Advocate for and implement CI/CD practices to streamline the deployment of Databricks solutions.

- Contribute to developing a data mesh architecture, promoting decentralized data ownership and accessibility across the organization.

- Understand and articulate the analytics capabilities of Databricks, enabling teams to derive actionable insights from their data.

Skills And Qualifications

- 7+ years of experience with Big Data technologies, including Apache Spark, cloud-native data lakes, and data mesh platforms, in a technical architecture or consulting role.

- 5+ years of independent experience in Big Data architecture.

- Proficiency in Python coding and familiarity with data engineering best practices.

- Extensive experience working with AWS cloud platforms, including a solid understanding of AWS services and architecture.

- Strong documentation and whiteboarding skills to effectively communicate complex ideas.

- In-depth knowledge of the latest services offered by Databricks, with the ability to evaluate and integrate these services into our platform.

- Proven experience implementing solutions using Databricks Unity Catalog, focusing on data governance and lineage tracking.

- Demonstrated expertise in migrating from the classic Databricks platform to the Lakehouse architecture, utilizing the Delta file format and/or Delta Live Tables.

- A collaborative mindset with the ability to work effectively across teams and functions.

