Job Description

About Koantek :

- Koantek is a Databricks Pure-Play Elite Partner, helping enterprises modernize faster and unlock the full power of Data and AI.

- Backed by Databricks Ventures and honored as a six-time Databricks Partner of the Year, we enable global enterprises to modernize at speed, operationalize AI, and realize the full value of their data.

- Our deep expertise spans industries such as healthcare, financial services, retail, and SaaS, delivering end-to-end solutions from rapid prototyping to production-scale AI deployments.

- We deliver tailored solutions that enable businesses to leverage data for growth and innovation.

- Our team of experts utilizes deep industry knowledge combined with cutting-edge technologies, tools, and methodologies to drive impactful results.

- By partnering with clients across a diverse range of industries, from emerging startups to established enterprises, we help them uncover new opportunities and achieve a competitive advantage in the digital age.


About the Role :


- As a Solutions Architect at Koantek, you will collaborate with customers to design scalable data architectures utilizing Databricks technology and services.

- The Resident Solutions Architect (RSA) at Koantek builds secure, highly scalable big data solutions that achieve tangible, data-driven outcomes, all the while keeping simplicity and operational effectiveness in mind.

- Leveraging your technical expertise and business acumen, you will navigate complex technology discussions, showcasing the value of the Databricks platform throughout the sales process.

- Working alongside Account Executives, you will engage with customers' technical leaders, including architects, engineers, and operations teams, aiming to become a trusted advisor who delivers concrete outcomes.

- This role collaborates with teammates, product teams, and cross-functional project teams to lead the adoption and integration of the Databricks Platform into the enterprise ecosystem and AWS/Azure/GCP architecture.


The impact you will have :


Develop Account Strategies :


- Work with Sales and other essential partners to develop strategies for your assigned accounts to grow their usage of the Databricks platform.


Establish Architecture Standards :



- Establish the Databricks Lakehouse architecture as the standard data architecture for customers through excellent technical account planning.

Demonstrate Value :


- Build and present reference architectures and demo applications to help prospects understand how Databricks can be used to achieve their goals and land new use cases.

Capture Technical Wins :



- Consult on big data architectures, data engineering pipelines, and data science/machine learning projects to prove out Databricks technology for strategic customer projects.

- Validate integrations with cloud services and other third-party applications.

Promote Open-Source Projects :

- Become an expert in, and promote, Databricks-inspired open-source projects (Spark, Delta Lake, MLflow) across developer communities through meetups, conferences, and webinars.

Technical Expertise :

- Experience translating a customer's business needs into technology solutions, including establishing buy-in with essential customer stakeholders at all levels of the business.

- Experienced at designing, architecting, and presenting data systems for customers and managing the delivery of production solutions of those data architectures.

- Projects delivered with hands-on development experience on Databricks.

- Expert-level knowledge of data frameworks, data lakes, and open-source projects such as Apache Spark, MLflow, and Delta Lake.

- Expert-level hands-on coding experience in Spark/Scala, Python, or PySpark (a brief, hypothetical sketch follows this list).

- In-depth understanding of Spark architecture, including Spark Core, Spark SQL, DataFrames, Spark Streaming, RDD caching, and Spark MLlib.

- Experience with event-driven and microservices architectures in the cloud.

- Deep experience with distributed computing with Spark, including knowledge of the Spark runtime.

- Experience with private and public cloud architectures, pros/cons, and migration considerations.

- Extensive hands-on experience implementing data migration and data processing using AWS/Azure/GCP services.

- Familiarity with CI/CD for production deployments.

- Familiarity with optimization for performance and scalability.

- Completed data engineering professional certification and required classes.

- SQL proficiency: fluent in SQL and database technology.
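
For orientation, here is a minimal, hypothetical PySpark sketch of the kind of workload referenced above: DataFrame transformations landed as a Delta Lake table. The bucket paths, column names, and app name are illustrative assumptions rather than details of the role, and the Delta writer assumes the delta-spark package is configured on the cluster.

    # Hypothetical sketch: ingest raw JSON events, aggregate with the
    # DataFrame API, and persist the result as a Delta Lake table.
    # Paths and column names below are illustrative assumptions.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("demo-daily-counts").getOrCreate()

    # Read raw events (hypothetical bucket and schema).
    events = spark.read.json("s3://example-bucket/raw/events/")

    # Aggregate events per day and type with the DataFrame API.
    daily_counts = (
        events
        .withColumn("event_date", F.to_date("event_ts"))
        .groupBy("event_date", "event_type")
        .agg(F.count("*").alias("events"))
    )

    # Write a Delta table for downstream SQL/BI consumers
    # (assumes delta-spark is available on the cluster).
    (
        daily_counts.write
        .format("delta")
        .mode("overwrite")
        .save("s3://example-bucket/gold/daily_counts/")
    )

    spark.stop()

The same aggregation could equally be expressed in Spark SQL once the Delta table is registered, which is where the SQL proficiency noted above comes into play.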

Educational Background :



- Degree in a quantitative discipline (Computer Science, Applied Mathematics, Operations Research).

- Relevant certifications (e.g., Databricks certifications, AWS/Azure/GCP AI/ML certifications) are a plus.

Workplace Flexibility :



- This is a hybrid role with remote flexibility.

- On-site presence at customer locations may be required based on project and business needs.

- Candidates should be willing and able to travel for short or medium-term assignments when necessary.

