
Pratt & Whitney - Data Engineer - Python/Kafka

Pratt & Whitney
Bangalore
5 - 7 Years
3.9 | 37+ Reviews

Posted on: 03/08/2025

Job Description

Digital Engine Services - Data Engineer

Experience : 5+ Years

Employment Type : Full-time

Job Overview :

We are seeking a skilled Digital Engine Services - Data Engineer with a minimum of 5 years of experience to join our team. You will be responsible for designing, building, and optimizing our data pipelines, with a strong focus on real-time streaming, data ingestion, and database optimization. The ideal candidate will possess a strong analytical and performance-driven mindset, with expertise in SQL, data visualization, and a variety of programming languages to support our full-stack data ecosystem.

Key Responsibilities :

Data Pipeline Engineering & Event Streaming :

- Architect and maintain real-time streaming pipelines using technologies like Kafka Streams.

- Implement key-based aggregations, windowing, and stateful operations backed by state stores such as RocksDB (see the sketch after this list).

- Design event schemas and API contracts that serve internal components and downstream consumers with minimal coupling.
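
The aggregation and windowing responsibilities above might look roughly like the following Kafka Streams sketch. The topic names (engine-events, engine-event-counts-out), the 5-minute tumbling window, and the plain string payloads are illustrative assumptions, not details from this posting; the window store the DSL creates for the count is backed by RocksDB on local disk by default.

```java
import java.time.Duration;
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.kstream.Produced;
import org.apache.kafka.streams.kstream.TimeWindows;

public class EngineEventAggregator {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "engine-event-aggregator");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        StreamsBuilder builder = new StreamsBuilder();

        // Raw engine events, keyed by engine ID; string values keep the sketch short.
        KStream<String, String> events = builder.stream(
                "engine-events", Consumed.with(Serdes.String(), Serdes.String()));

        // Key-based aggregation over 5-minute tumbling windows. The window store
        // created for the count is a persistent RocksDB store by default.
        events.groupByKey(Grouped.with(Serdes.String(), Serdes.String()))
              .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(5)))
              .count(Materialized.as("engine-event-counts"))
              .toStream()
              .map((windowedKey, count) -> KeyValue.pair(windowedKey.key(), count.toString()))
              .to("engine-event-counts-out", Produced.with(Serdes.String(), Serdes.String()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

Publishing the results to a separate output topic with explicit serdes is one way to keep downstream consumers decoupled from the internal store format, in line with the low-coupling goal above.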

Data Ingestion & Persistence :

- Upgrade and maintain ingestion logic to persist processed outputs into structured databases (a persistence sketch follows this list).

- Work primarily with Microsoft SQL Server for on-prem deployments and a variety of cloud-native databases in AWS.

- Enhance and maintain database APIs for both batch and real-time data consumers throughout the Ground and Analytics pipelines.
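
As a rough illustration of the persistence step, the sketch below batches processed records into an on-prem SQL Server table over plain JDBC. The connection string, the dbo.ProcessedEngineMetrics table, its columns, and the ProcessedRecord type are all invented for the example.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.time.Instant;
import java.util.List;

public class ProcessedOutputWriter {

    // Connection string, table, and column names are illustrative assumptions.
    private static final String URL =
            "jdbc:sqlserver://localhost:1433;databaseName=EngineData;encrypt=true;trustServerCertificate=true";

    /** Persists one micro-batch of processed records into SQL Server. */
    public static void writeBatch(List<ProcessedRecord> records, String user, String password)
            throws SQLException {
        String sql = "INSERT INTO dbo.ProcessedEngineMetrics (EngineId, MetricName, MetricValue, ProcessedAt) "
                   + "VALUES (?, ?, ?, ?)";
        try (Connection conn = DriverManager.getConnection(URL, user, password);
             PreparedStatement ps = conn.prepareStatement(sql)) {
            conn.setAutoCommit(false); // commit the whole micro-batch atomically
            for (ProcessedRecord r : records) {
                ps.setString(1, r.engineId());
                ps.setString(2, r.metricName());
                ps.setDouble(3, r.metricValue());
                ps.setTimestamp(4, Timestamp.from(Instant.now()));
                ps.addBatch();
            }
            ps.executeBatch();
            conn.commit();
        }
    }

    /** Minimal record type for the example. */
    public record ProcessedRecord(String engineId, String metricName, double metricValue) {}
}
```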

Database Optimization & Complex Query Engineering :

- Optimize SQL queries and stored procedures for high-volume transactional loads.

- Collaborate with data analysts and business units to model data tables and relations that support Power BI needs.

- Fine-tune indexing strategies, partitioning, and caching logic (see the sketch after this list).
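
One hypothetical shape of the indexing and query-tuning work: the sketch below creates a covering nonclustered index on the invented dbo.ProcessedEngineMetrics table from the previous sketch, then runs a window-function query that the index can satisfy without key lookups. All object names are assumptions, not part of the posting.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class MetricQueryTuning {

    /** One-time DDL: a covering nonclustered index for the hot lookup path (names are illustrative). */
    public static void createCoveringIndex(Connection conn) throws SQLException {
        try (Statement stmt = conn.createStatement()) {
            stmt.execute(
                "CREATE NONCLUSTERED INDEX IX_ProcessedEngineMetrics_EngineId_ProcessedAt "
              + "ON dbo.ProcessedEngineMetrics (EngineId, ProcessedAt DESC) "
              + "INCLUDE (MetricName, MetricValue)");
        }
    }

    /**
     * Latest reading per metric for one engine, using ROW_NUMBER() so the
     * covering index above can answer the query without key lookups.
     */
    public static void printLatestMetrics(Connection conn, String engineId) throws SQLException {
        String sql =
            "SELECT MetricName, MetricValue, ProcessedAt FROM ("
          + "  SELECT MetricName, MetricValue, ProcessedAt,"
          + "         ROW_NUMBER() OVER (PARTITION BY MetricName ORDER BY ProcessedAt DESC) AS rn"
          + "  FROM dbo.ProcessedEngineMetrics WHERE EngineId = ?"
          + ") latest WHERE rn = 1";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setString(1, engineId);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.printf("%s = %.3f at %s%n",
                            rs.getString("MetricName"), rs.getDouble("MetricValue"),
                            rs.getTimestamp("ProcessedAt"));
                }
            }
        }
    }
}
```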

System Monitoring, Observability & Quality :

- Instrument data pipeline components and APIs with structured logs for ingestion into OpenSearch and visualization in Kibana (a logging sketch follows this list).

- Conduct continuous quality checks during data transformation and ingestion phases to ensure data traceability and capture anomalies.
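
A minimal sketch of the structured-logging idea above, assuming Jackson is available for JSON serialization: each pipeline stage emits one JSON object per event so a log shipper can forward it to OpenSearch and the fields stay queryable in Kibana. The field names and the records_dropped heuristic are illustrative, not prescribed by this posting.

```java
import java.time.Instant;
import java.util.LinkedHashMap;
import java.util.Map;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;

/** Emits one structured (JSON) log event per processed batch so a log shipper
 *  can forward it to OpenSearch for visualization in Kibana. */
public class StructuredPipelineLogger {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    public static void logBatchProcessed(String pipelineStage, String engineId,
                                         long recordsIn, long recordsOut) throws JsonProcessingException {
        Map<String, Object> event = new LinkedHashMap<>();
        event.put("@timestamp", Instant.now().toString());
        event.put("level", "INFO");
        event.put("stage", pipelineStage);              // e.g. "ingestion" or "aggregation"
        event.put("engine_id", engineId);
        event.put("records_in", recordsIn);
        event.put("records_out", recordsOut);
        event.put("records_dropped", recordsIn - recordsOut); // simple traceability/anomaly signal
        // One JSON object per line keeps the output easy to ship and index.
        System.out.println(MAPPER.writeValueAsString(event));
    }
}
```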

Cross-Functional Collaboration :

- Work closely with the team to test and validate data pipeline artifacts.

- Support internal and external developers in producing and consuming data pipelines and APIs through documentation and well-defined contracts/schemas.

Qualifications You Must Have :

- Bachelor's degree in Computer Science, Software Engineering, Data Engineering, or a related field.

- Minimum of 5 years of experience in data engineering or a similar role.

- Strong SQL development skills, including indexing, complex joins, window functions, stored procedures, and query optimization.

- Experience with data visualization tools such as Power BI or OpenSearch/Kibana.

- Proficiency in Microsoft Excel for data manipulation and reporting.

- Familiarity with Java, Python, and C# for API development and maintenance.

- Exposure to stream processing, schema registries, API contract versioning and evolution, and stateful operations.

- Analytical & Performance Mindset: Ability to interpret large datasets, draw meaningful conclusions, and present insights effectively while considering latency, throughput, and operational cost.

- Communication Skills: Strong written and verbal communication skills to convey information between back-end developers, data analysts, and system engineers, while maintaining accountability and ownership of data design and change outcomes.

Qualifications We Prefer :

- Experience architecting and maintaining real-time streaming pipelines using Kafka Streams.

- Proficiency in designing event schemas and API contracts that minimize coupling.

- Hands-on experience with both Microsoft SQL Server and cloud-native databases in AWS.

- Proven ability to optimize SQL queries for high-volume transactional loads.

- Experience with monitoring tools like OpenSearch and Kibana.

- Familiarity with data traceability and quality checks during data transformation.

- Strong collaboration skills to support both internal and external developers.

