hirist

Straive - Senior/Lead Data Engineer - ETL/Apache Airflow

SPI TECHNOLOGIES INDIA PRIVATE LIMITED
5 - 10 Years
Multiple Locations

Posted on: 06/03/2026

Job Description


We are looking for a Senior/Lead Data Engineer to architect and scale our next-generation data platform. In this role, you will move beyond simple ETL to build sophisticated, automated pipelines that handle both batch and real-time streaming data. You will be responsible for ensuring our data is clean, reliable, and structured for maximum performance in a cloud environment.


Key Responsibilities :

- Pipeline Orchestration : Design and maintain automated ELT/ETL pipelines using Apache Airflow to ensure high availability and efficient data flow.

- Cloud Data Modeling : Architect Star and Snowflake schemas specifically within cloud warehouses like Snowflake, Databricks, or BigQuery to optimize for both query performance and storage costs.

- Transformation & Logic : Utilize dbt (data build tool) to create modular, tested, and well-documented SQL transformation layers, moving away from monolithic scripts.

- Data Quality & Reliability : Proactively catch "dirty data" by implementing dead man's switches and automated data observability using tools like dbt tests or Great Expectations.

- Real-time Streaming : Develop and monitor streaming pipelines via Kafka or AWS Kinesis to handle high-velocity data such as real-time player telemetry or market feeds (Good to have).
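
The "dead man's switch" pattern in the data-quality responsibility above alerts when a pipeline *stops* checking in, rather than only when it reports an explicit error. A minimal sketch in plain Python (the function name, heartbeat source, and 25-hour threshold are illustrative, not from this posting):

```python
from datetime import datetime, timedelta

# Illustrative freshness threshold: a daily job plus one hour of slack.
MAX_SILENCE = timedelta(hours=25)

def is_pipeline_stale(last_heartbeat: datetime, now: datetime) -> bool:
    """Dead man's switch: True means the pipeline missed its check-in
    and an alert should fire, even though no task raised an error."""
    return now - last_heartbeat > MAX_SILENCE

now = datetime(2025, 1, 2, 12, 0)
assert not is_pipeline_stale(datetime(2025, 1, 2, 9, 0), now)  # 3h ago: fresh
assert is_pipeline_stale(datetime(2025, 1, 1, 8, 0), now)      # 28h ago: alert
```

In practice the heartbeat would be written by the orchestrator (e.g. an Airflow on-success callback) and the check run by an independent monitor, so that a dead scheduler cannot silence its own alarm.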

Key Tech Stack & Skills :

Required :

- 5+ years of experience in Data Engineering with a focus on cloud-native tools.

- Mastery of SQL & Python : Ability to write complex transformations and custom automation scripts.

- Orchestration Expert : Hands-on experience with Airflow (DAG development, task scheduling).

- Modern Data Stack (MDS) : Deep experience with dbt and at least one major cloud warehouse (Snowflake, BigQuery, etc.).

- Observability Mindset : A "test-first" approach to data, ensuring that alerts fire whenever a pipeline fails.

Preferred (The "Good to Have") :

- Experience with Streaming/Event-driven architecture (Kafka, Flink, or Kinesis).

- Experience with Infrastructure as Code (Terraform/Pulumi).

- Knowledge of Data Governance and Security best practices in the cloud.

Required Education & Experience :

- Education : Bachelor's degree in Engineering, Computer Science, or a related technical field.

- Experience : 5+ years of relevant experience.
