hirist

Data Pipeline Engineer - ElasticSearch

Dexian
Bangalore
5 - 8 Years

Posted on: 30/08/2025

Job Description

Overview:


We are seeking a skilled and motivated Data Pipeline Engineer to join our team.
In this role, you will manage and maintain critical data pipeline platforms that collect, transform, and transmit cyber events data to downstream platforms such as Elasticsearch and Splunk.


You will be responsible for ensuring the reliability, scalability, and performance of the pipeline infrastructure while building complex integrations with cloud and on-premises cyber systems.
Our key stakeholders are cyber teams, including security response, investigations, and insider threat.

Role Profile:


A successful applicant will contribute to several important initiatives, including:


- Collaborate with Cyber teams to identify, onboard, and integrate new data sources into the platform.

- Design and implement data mapping, transformation, and routing processes to meet analytics and monitoring requirements.

- Develop automation tools that integrate with in-house configuration management frameworks and APIs.

- Monitor the health and performance of the data pipeline infrastructure.

- Act as a top-level escalation point for complex troubleshooting, working with other infrastructure teams to resolve issues.


- Create and maintain detailed documentation for pipeline architecture, processes, and integrations.
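The mapping, transformation, and routing duties above can be illustrated with a minimal pipeline step. The field names, severity rule, and destination names below are illustrative assumptions, not details taken from the role:

```python
# Rough sketch of a transform-and-route step; field names ('ts', 'severity')
# and destinations ('splunk', 'elasticsearch') are illustrative assumptions.
def transform(event: dict) -> dict:
    """Normalize the timestamp key before forwarding downstream."""
    out = dict(event)
    if 'ts' in out:
        out['@timestamp'] = out.pop('ts')
    return out

def route(event: dict) -> str:
    """Pick a downstream destination based on a hypothetical severity field."""
    return 'splunk' if event.get('severity') == 'high' else 'elasticsearch'

evt = {'ts': '2025-08-30T10:15:00Z', 'severity': 'high', 'msg': 'failed login'}
print(route(transform(evt)))  # 'splunk'
```

In a real deployment this logic would typically live in a dataflow product such as Cribl, Logstash, or NiFi rather than bespoke code.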

Required Skills:


- Hands-on experience deploying and managing large-scale dataflow products such as Cribl, Logstash, or Apache NiFi.

- Hands-on experience integrating data pipelines with cloud platforms (e.g., AWS, Azure, Google Cloud) and on-premises systems.


- Hands-on experience in developing and validating field extraction using regular expressions.

- A solid understanding of operating systems and networking concepts: Linux/Unix system administration, HTTP, and encryption.

- Good understanding of software version control, deployment, and build tools using DevOps SDLC practices (Git, Jenkins, Jira).

- Strong analytical and troubleshooting skills.

- Excellent verbal & written communication skills.

- Appreciation of Agile methodologies, specifically Kanban.
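The regex-based field extraction skill listed above can be sketched briefly. The log line and key=value format here are illustrative assumptions; real event formats will vary by source:

```python
import re

# Hypothetical syslog-style event line (illustrative only).
event = '2025-08-30T10:15:00Z host=web01 user=alice action=login status=success'

# Extract key=value pairs with a named-group regular expression.
pattern = re.compile(r'(?P<key>\w+)=(?P<value>\S+)')
fields = {m.group('key'): m.group('value') for m in pattern.finditer(event)}

print(fields)
# {'host': 'web01', 'user': 'alice', 'action': 'login', 'status': 'success'}
```

Validating extractions against sample events like this, before deploying them to a pipeline, is the kind of work the requirement describes.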

Desired Skills:


- Enterprise experience with a distributed event streaming platform such as Apache Kafka, AWS Kinesis, Google Pub/Sub, or MQ.


- Infrastructure automation and integration experience, ideally using Python and Ansible.

- Familiarity with cybersecurity concepts, event types, and monitoring requirements.

- Experience in parsing and normalizing data in Elasticsearch using Elastic Common Schema (ECS).
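ECS normalization amounts to renaming source-specific fields to the shared schema's names. A minimal sketch follows; the input keys and the mapping table are illustrative assumptions, though `source.ip`, `user.name`, and `event.action` are real ECS field names:

```python
# Map hypothetical raw keys to Elastic Common Schema (ECS) field names.
RAW_TO_ECS = {
    'src': 'source.ip',
    'user': 'user.name',
    'action': 'event.action',
}

def to_ecs(raw: dict) -> dict:
    """Rename known keys to their ECS equivalents; pass the rest through."""
    return {RAW_TO_ECS.get(k, k): v for k, v in raw.items()}

event = {'src': '10.0.0.5', 'user': 'alice', 'action': 'login'}
print(to_ecs(event))
# {'source.ip': '10.0.0.5', 'user.name': 'alice', 'event.action': 'login'}
```

In practice this mapping is usually expressed as an Elasticsearch ingest pipeline or Logstash filter rather than standalone code.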

