
About the job:

Come work at a place where innovation and teamwork come together to support the most exciting missions in the world!

Job Description:

We are seeking a talented Lead Big Data Engineer to deliver roadmap features of the Unified Asset Inventory.

This is a great opportunity to be an integral part of a team building Qualys' next-generation, microservices-based platform that processes over 100 million transactions and terabytes of data per day, to leverage open-source technologies, and to work on challenging, business-impacting projects.

Responsibilities:

- Build the Unified Asset Management product in the cloud.

- Build highly scalable microservices that interact with the Qualys Cloud Platform.

- Research, evaluate, and adopt next-generation technologies.

- Produce high-quality software, following sound architecture and design principles, that you and your team will find easy to work with in the future.

- This is a fantastic opportunity to be an integral part of a team building Qualys' next-generation platform, using Big Data and microservices-based technology to process billions of transactions per day, leveraging open-source technologies, and working on challenging, business-impacting initiatives.

Qualifications:

- Bachelor's degree in Computer Science or equivalent.

- 10+ years of total experience.

- 4+ years of relevant experience designing and architecting Big Data solutions using Spark.

- 3+ years of experience working with engineering resources to drive innovation.

- 4+ years of experience with Big Data event-flow pipelines.

- 3+ years of experience in performance testing for large-scale infrastructure.

- 3+ years of in-depth experience with search solutions such as Solr and Elasticsearch.

- 3+ years of experience with Kafka.

- In-depth experience with data lakes and related ecosystems.

- In-depth experience with messaging queues.

- In-depth experience defining requirements to build scalable architectures for Big Data and microservices environments.

- In-depth experience with caching components and services.

- Knowledge of Presto.

- Knowledge of Airflow.

- Hands-on experience with scripting and automation.

- In-depth understanding of RDBMS/NoSQL, Oracle, Cassandra, Kafka, Redis, Hadoop, and Lambda, Kappa, and Kappa++ architectures with Flink data streaming and rule engines.

- Experience with ML model engineering and related deployment.

- Ability to design and implement secure Big Data clusters that meet compliance and regulatory requirements.

- Experience leading the delivery of large-scale systems focused on managing the infrastructure layer of the technology stack.

- Strong experience in performance benchmarking of Big Data technologies.

- Strong troubleshooting skills.

- Experience leading development life-cycle processes and best practices.

- Experience in Big Data services administration would be an added advantage.

- Experience with Agile methodologies (Scrum, RUP, XP), OO modeling, and internet, UNIX, middleware, and database-related projects.

- Experience mentoring/training the engineering community on complex technical issues.

