
FirstHive - Lead DataOps Engineer - ETL/ELT Pipelines

Posted on: 13/11/2025

Job Description

Job Designation : Senior DataOps Engineer - Team Lead

Job Location : Bengaluru

Key Responsibilities :

Leadership & Team Management :

- Lead and mentor a team of DataOps engineers in designing and maintaining robust data pipelines.

- Plan, assign, and review team tasks to ensure timely and quality delivery.

- Collaborate with data engineers, data scientists, and business teams to prioritize data needs and ensure alignment with organizational goals.

- Drive best practices in coding standards, documentation, and deployment automation.

Technical Delivery :

- Design and implement scalable ETL/ELT pipelines using Pentaho, StreamSets, and Python-based frameworks (a minimal illustrative sketch follows this list).

- Manage real-time and batch data ingestion using Kafka for streaming and MySQL/Snowflake for storage and transformation.

- Implement and maintain data quality checks, validation, and reconciliation frameworks.

- Build pipeline observability, error-handling, and alerting mechanisms for proactive issue resolution.

- Optimize Snowflake and MySQL queries for performance and cost efficiency.

- Lead migration or modernization initiatives (e.g., on-prem to Snowflake/cloud).
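For illustration only, here is a minimal Python sketch of the kind of streaming ingestion step described in the list above: consume events from a Kafka topic, apply a basic data-quality check, and load valid rows into MySQL. The topic, table, and connection details are invented placeholders, and the confluent-kafka and mysql-connector-python libraries are one possible toolchain among several.

    # Sketch: Kafka -> data-quality check -> MySQL, the shape of one
    # streaming ingestion step. All names and credentials are placeholders.
    import json
    import mysql.connector
    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "dataops-demo",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["events"])  # hypothetical topic

    db = mysql.connector.connect(host="localhost", user="etl",
                                 password="secret", database="warehouse")
    cursor = db.cursor()

    def is_valid(record):
        # Minimal data-quality gate: required fields present and non-empty.
        return bool(record.get("id")) and bool(record.get("ts"))

    try:
        while True:
            msg = consumer.poll(1.0)
            if msg is None or msg.error():
                continue
            record = json.loads(msg.value())
            if not is_valid(record):
                continue  # production code would dead-letter and alert instead
            cursor.execute(
                "INSERT INTO events_raw (id, ts, payload) VALUES (%s, %s, %s)",
                (record["id"], record["ts"], json.dumps(record)),
            )
            db.commit()
    finally:
        consumer.close()
        db.close()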

Governance & Operations :

- Maintain data security, access control, and compliance with enterprise standards.

- Define and track DataOps KPIs such as pipeline success rates, latency, and data quality metrics.

- Partner with Infrastructure and DevOps teams for seamless environment management and scalability.

Technical Skills Required :

Databases :

- Strong expertise in MySQL (query optimization, stored procedures, schema design).

- Advanced knowledge of Snowflake (data modelling, performance tuning, cost optimization).
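As one concrete flavour of the query-optimization expertise asked for above, the sketch below inspects a slow MySQL query plan with EXPLAIN and then adds a covering index; the table and column names are hypothetical. The same plan-first habit carries over to Snowflake, where the query profile plays the role EXPLAIN plays here.

    # Sketch: diagnose a slow query, then add a covering index.
    # Table and column names are hypothetical.
    import mysql.connector

    db = mysql.connector.connect(host="localhost", user="etl",
                                 password="secret", database="warehouse")
    cursor = db.cursor()

    # EXPLAIN shows whether the query does a full scan ("ALL") or uses a key.
    cursor.execute("EXPLAIN SELECT id, ts FROM events_raw WHERE customer_id = 42")
    for row in cursor.fetchall():
        print(row)

    # An index covering the filter and the selected columns lets MySQL answer
    # from the index alone, avoiding the full table scan.
    cursor.execute("CREATE INDEX idx_customer_ts ON events_raw (customer_id, id, ts)")
    db.close()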

ETL & Data Pipeline Tools :

- Hands-on experience with Pentaho Data Integration (Kettle) and/or StreamSets for ETL/ELT automation.
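Pentaho (Kettle) transformations are usually designed in a GUI, but runs are commonly automated from scripts through the Pan command-line runner. The sketch below is a hedged example of such a wrapper: the installation path and .ktr file are placeholders, and the exact Pan flags should be verified against the installed PDI version.

    # Sketch: run a Pentaho (Kettle) transformation via the Pan CLI runner.
    # Paths and flags are assumptions to verify against your PDI version.
    import subprocess
    import sys

    result = subprocess.run(
        ["/opt/pentaho/data-integration/pan.sh",
         "-file=/etl/transforms/load_events.ktr",  # hypothetical transformation
         "-level=Basic"],
        capture_output=True, text=True,
    )
    print(result.stdout)
    if result.returncode != 0:
        # Pan reports failures through non-zero exit codes; propagate them so
        # the scheduler (cron, Airflow, etc.) can alert on the failed run.
        sys.exit(result.returncode)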

Streaming :

- In-depth understanding of Apache Kafka (topic configuration, producer/consumer setup, schema registry, stream processing).
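As a small, hedged illustration of two of the Kafka points above (topic configuration and producer setup), the following sketch uses the confluent-kafka Python client; the broker address, topic name, and settings are placeholders.

    # Sketch: create a configured topic, then publish to it.
    # Broker, topic, and settings are illustrative placeholders.
    import json
    from confluent_kafka import Producer
    from confluent_kafka.admin import AdminClient, NewTopic

    admin = AdminClient({"bootstrap.servers": "localhost:9092"})
    futures = admin.create_topics([
        NewTopic("events", num_partitions=6, replication_factor=3,
                 config={"retention.ms": "604800000"}),  # 7-day retention
    ])
    for topic, future in futures.items():
        future.result()  # raises if creation failed (e.g., topic exists)

    producer = Producer({"bootstrap.servers": "localhost:9092"})
    producer.produce("events", key="42",
                     value=json.dumps({"id": 42, "ts": "2025-11-13T00:00:00Z"}))
    producer.flush()  # block until outstanding messages are delivered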

Programming :

- Proficiency in Python for data automation, transformation scripts, and integration with APIs.
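Below is a hedged sketch of the sort of API-integration script this bullet implies, using the requests library; the endpoint and field names are invented.

    # Sketch: pull records from a hypothetical REST API and normalize them.
    import requests

    resp = requests.get("https://api.example.com/v1/leads", timeout=10)
    resp.raise_for_status()  # fail loudly on HTTP errors

    rows = []
    for item in resp.json():
        # Normalize names and types before handing rows to the loader.
        rows.append({
            "id": int(item["id"]),
            "email": item.get("email", "").strip().lower(),
        })
    print(f"prepared {len(rows)} rows")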

Monitoring & Observability :

- Familiarity with Grafana, Prometheus, or similar tools for performance and error tracking.
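To make this concrete (and to tie in the pipeline success-rate and latency KPIs mentioned under Governance & Operations), here is a minimal prometheus_client sketch that exports run counts and durations for Grafana dashboards and alerts; the metric names, port, and simulated workload are illustrative.

    # Sketch: export pipeline KPIs (success rate, latency) to Prometheus.
    import random
    import time
    from prometheus_client import Counter, Histogram, start_http_server

    PIPELINE_RUNS = Counter("pipeline_runs_total",
                            "Pipeline runs by outcome", ["status"])
    RUN_SECONDS = Histogram("pipeline_run_seconds",
                            "Pipeline run duration in seconds")

    start_http_server(8000)  # exposes /metrics for Prometheus to scrape

    def run_pipeline():
        time.sleep(random.random())  # stand-in for real pipeline work
        return random.random() > 0.1  # stand-in for success/failure outcome

    while True:
        with RUN_SECONDS.time():  # records duration into the histogram
            ok = run_pipeline()
        PIPELINE_RUNS.labels(status="success" if ok else "failure").inc()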

Cloud :

- Exposure to AWS/Azure/GCP data stack (S3, Lambda, Glue, Dataflow, etc.).
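Of the services listed, S3 is the most common landing zone for pipeline output; a minimal boto3 sketch (bucket and key names invented, credentials assumed to come from the environment or an IAM role) might look like this:

    # Sketch: land a pipeline output file in S3, then sanity-check the prefix.
    import boto3

    s3 = boto3.client("s3")  # credentials resolved from env vars or IAM role
    s3.upload_file("/tmp/events_2025-11-13.csv", "my-data-lake",
                   "raw/events/2025-11-13.csv")

    resp = s3.list_objects_v2(Bucket="my-data-lake", Prefix="raw/events/")
    for obj in resp.get("Contents", []):
        print(obj["Key"], obj["Size"])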

