Posted on: 04/12/2025
Description:
- You will engage in a diverse range of high-impact customer-facing big data projects, including the development of reference architectures, how-to guides, and Minimum Viable Products (MVPs).
- Lead strategic initiatives that encompass the end-to-end design, build, and deployment of industry-leading big data and AI applications.
- Provide architectural guidance that fosters the adoption of Databricks across business-facing functions.
- Experience with Databricks infrastructure and connectivity, integration patterns, and Unity Catalog migration.
- Experience with cluster management, networking, infrastructure, platform engineering, and data governance.
- Collaborate with platform engineering teams to ensure the effective implementation of Databricks services within our infrastructure.
- Evaluate and assess new features and enhancements on the Databricks platform to ensure we leverage the latest capabilities.
- Design and implement various integration patterns involving Databricks and third-party tools such as Collibra and data quality solutions.
- Utilize your expertise in AWS administration and architecture to optimize cloud resources and ensure seamless integration with Databricks.
- Leverage your hands-on experience with Databricks Unity Catalog to implement robust data governance and lineage capabilities.
- Advocate for and implement CI/CD practices to streamline the deployment of Databricks solutions.
- Contribute to the development of a data mesh architecture, promoting decentralized data ownership and accessibility across the organization.
- Understand and articulate the analytics capabilities of Databricks, enabling teams to derive actionable insights from their data.
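The CI/CD practices mentioned above are commonly implemented with Databricks Asset Bundles. A minimal illustrative `databricks.yml` sketch is shown below; the bundle name, workspace host, notebook path, and cluster settings are all placeholders, not values from this posting:

```yaml
# Illustrative Databricks Asset Bundle definition (placeholder names throughout).
bundle:
  name: example-etl-bundle

targets:
  dev:
    mode: development
    workspace:
      host: https://example.cloud.databricks.com  # placeholder workspace URL

resources:
  jobs:
    nightly_etl:
      name: nightly-etl
      tasks:
        - task_key: ingest
          notebook_task:
            notebook_path: ./notebooks/ingest.py
          new_cluster:
            spark_version: 13.3.x-scala2.12
            node_type_id: i3.xlarge   # AWS node type, matching the AWS focus above
            num_workers: 2
```

A bundle like this would typically be validated and deployed from a CI pipeline with the Databricks CLI, e.g. `databricks bundle deploy -t dev`.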
Responsibilities and qualifications:
- Extensive experience working with AWS cloud platforms, including a solid understanding of AWS services and architecture.
- Strong documentation and whiteboarding skills to effectively communicate complex ideas.
- In-depth knowledge of the latest services offered by Databricks, with the ability to evaluate and integrate these services into our platform, including serverless Databricks.
- Proven experience in implementing solutions using Databricks Unity Catalog, with a focus on data governance and lineage tracking.
- Demonstrated expertise in migrating from the classic Databricks platform to the Lakehouse architecture, using the Delta file format and/or Delta Live Tables.
- A collaborative mindset with the ability to work effectively across teams and functions.
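Several of the qualifications above center on Unity Catalog, which addresses securable objects through a three-level namespace of the form `catalog.schema.table`. As a small sketch only, a hypothetical helper that splits such a fully qualified name (it does not handle backtick-quoted identifiers) might look like:

```python
def split_uc_name(full_name: str) -> tuple[str, str, str]:
    """Split a Unity Catalog three-level name into (catalog, schema, table).

    Simplified sketch: backtick-quoted identifiers containing dots
    are not supported.
    """
    parts = full_name.split(".")
    if len(parts) != 3 or not all(parts):
        raise ValueError(f"expected catalog.schema.table, got {full_name!r}")
    catalog, schema, table = parts
    return catalog, schema, table

# Example: split_uc_name("main.sales.orders") -> ("main", "sales", "orders")
```

The function name and its use are hypothetical illustrations; only the three-level naming convention itself comes from Unity Catalog.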
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1584315