
Senior Data Engineer - ETL/Data Warehousing

TwinPacs Sdn Bhd
Hyderabad
6 - 9 Years

Posted on: 25/07/2025

Job Description

We are scaling an AI/ML-enabled enterprise SaaS solution that helps large enterprises, including multiple Fortune 500 companies, manage cash performance. You will own the architecture during the 1-to-10 journey of the product in the FinTech AI space.

Senior Data Engineer (ETL) | 6-9 Years | Hyderabad (Hybrid) | B2B / SaaS - FinTech Experience

Preferences:

FinTech experience and local candidates preferred; face-to-face final round required at the Hyderabad office

Engineering & CS graduates from premium colleges - IIT / NIT / BITS / REC

Interview Process:

3 Technical Sessions + 1 CTO Round + 1 Face-to-Face Managerial Round (mandatory)

Job Role:

Design, build, and optimize data pipelines to ingest, process, transform, and load data from various sources into our data platform

Implement and maintain ETL workflows using tools like Debezium, Kafka, Airflow, and Jenkins to ensure reliable and timely data processing

Develop and optimize SQL and NoSQL database schemas, queries, and stored procedures for efficient data retrieval and processing

Work with both relational databases (MySQL, PostgreSQL) and NoSQL databases (MongoDB, DocumentDB) to build scalable data solutions

Design and implement data warehouse solutions that support analytical needs and machine learning applications

Collaborate with data scientists and ML engineers to prepare data for AI/ML models and implement data-driven features

Implement data quality checks, monitoring, and alerting to ensure data accuracy and reliability

Optimize query performance across various database systems through indexing, partitioning, and query refactoring

Develop and maintain documentation for data models, pipelines, and processes

Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs

Stay current with emerging technologies and best practices in data engineering

Perform independent research to understand product requirements and customer needs

Communicate effectively with project teams and other stakeholders; translate technical details for non-technical audiences

Create architectural artifacts for the data warehouse

Manage the team and its effort

Set expectations for the client and the team, and ensure all deliverables are completed on time and at the highest quality

Required Skills:

5+ years of experience in data engineering or related roles with a proven track record of building data pipelines and infrastructure

Strong proficiency in SQL and experience with relational databases like MySQL and PostgreSQL

Hands-on experience with NoSQL databases such as MongoDB or AWS DocumentDB

Expertise in designing, implementing, and optimizing ETL processes using tools like Kafka, Debezium, Airflow, or similar technologies

Experience with data warehousing concepts and technologies

Solid understanding of data modeling principles and best practices for both operational and analytical systems

Proven ability to optimize database performance, including query optimization, indexing strategies, and database tuning

Experience with AWS data services such as RDS, Redshift, S3, Glue, and Kinesis, as well as the ELK stack

Proficiency in at least one programming language (Python, Node.js, Java)

Experience with version control systems (Git) and CI/CD pipelines

Bachelor's degree in Computer Science, Engineering, or a related field from premium colleges - IIT / NIT / BITS / REC

This job is open to:

Women candidates preferred
Differently-abled candidates preferred
Ex-defence personnel preferred
Women returning to the workforce