hirist

Data Integration & Financial Analytics Engineer

Vikgol
Delhi NCR
3 - 7 Years

Posted on: 16/12/2025

Job Description

Job Title : Data Integration & Financial Analytics Engineer

Experience : 3 - 7 Years

Employment Type : Full-time

Location : Delhi

Joining : Immediate

About the Role :


- We are looking for a highly skilled Data Integration & Financial Analytics Engineer to design, build, and maintain scalable data pipelines and analytics systems for fintech and data-driven platforms. This role involves working extensively on ETL and reverse-ETL workflows, API-based integrations, financial data processing, and analytics automation.


- The ideal candidate will have a strong foundation in data engineering and analytics, hands-on experience with modern ETL orchestration tools, and exposure to financial systems such as payments, lending, ledgers, and reconciliation. You will collaborate closely with product, engineering, analytics, and business teams to deliver reliable, high-performance data solutions that power critical financial and operational use cases.

Key Responsibilities :

Data Integration, ETL & Reverse-ETL :


- Design, develop, and maintain scalable ETL and reverse-ETL pipelines across multiple data sources and destinations.


- Integrate data from fintech platforms, banking systems, lending platforms, ERPs, CRMs, payment gateways, logistics systems, and healthcare systems.


- Build and orchestrate workflows using tools such as Apache Airflow, Dagster, or Prefect.


- Implement data transformations and modeling using dbt, Python-based processing frameworks, and SQL.


- Develop custom reverse-ETL solutions using APIs and microservices to push data back into operational systems (CRMs, finance tools, internal platforms).


- Ensure data quality, accuracy, reliability, scalability, and performance across all pipelines.


- Implement monitoring, alerting, and logging to proactively identify and resolve data issues.
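
To illustrate the reverse-ETL pattern described above, here is a minimal pure-Python sketch. All names are hypothetical: the warehouse rows stand in for a SQL query against BigQuery/Snowflake, and `MockCRM` stands in for an operational system's API client.

```python
# Minimal reverse-ETL sketch: extract rows from an analytical store,
# derive an operational attribute, and push the result back into an
# operational system. All names and thresholds are illustrative.

def extract_from_warehouse():
    # Stand-in for a warehouse query (e.g. BigQuery/Snowflake).
    return [
        {"customer_id": 1, "lifetime_value": 1250.0, "segment": None},
        {"customer_id": 2, "lifetime_value": 80.0, "segment": None},
    ]

def transform(rows, high_value_threshold=1000.0):
    # Derive a customer segment from analytics data.
    for row in rows:
        row["segment"] = (
            "high_value" if row["lifetime_value"] >= high_value_threshold
            else "standard"
        )
    return rows

class MockCRM:
    # Stand-in for a CRM REST client; a real pipeline would call an API.
    def __init__(self):
        self.records = {}

    def upsert(self, customer_id, attributes):
        self.records[customer_id] = attributes

def run_reverse_etl(crm):
    # Extract -> transform -> load back into the operational system.
    for row in transform(extract_from_warehouse()):
        crm.upsert(row["customer_id"], {"segment": row["segment"]})

crm = MockCRM()
run_reverse_etl(crm)
```

In a real deployment each step would be a task in an orchestrator such as Airflow, Dagster, or Prefect, with retries and alerting around the load step.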

Data Processing & Architecture :


- Process large-scale datasets using Python, Pandas, Spark, or Apache Beam.


- Build and maintain batch and real-time data pipelines using Kafka, Kinesis, RabbitMQ, or similar streaming platforms.


- Design, optimize, and maintain data storage solutions across PostgreSQL, MySQL, BigQuery, Snowflake, MongoDB, and Redis.



- Design and consume REST, GraphQL, and gRPC APIs for data ingestion and data serving.


- Optimize query performance, indexing strategies, and data models for analytical and operational workloads.
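
As a small, runnable sketch of the indexing work mentioned above, the following uses SQLite (via the standard library) as a stand-in for PostgreSQL/MySQL; the table and column names are illustrative. It shows how adding an index changes the query plan from a full scan to an index search.

```python
import sqlite3

# Index-driven query optimization, sketched with SQLite as a stand-in
# for a production relational database.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE payments (id INTEGER PRIMARY KEY, merchant_id INTEGER, amount REAL)"
)
conn.executemany(
    "INSERT INTO payments (merchant_id, amount) VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(1000)],
)

# Without an index, filtering on merchant_id scans the whole table.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM payments WHERE merchant_id = 7"
).fetchall()

# With an index, the planner seeks directly to the matching rows.
conn.execute("CREATE INDEX idx_payments_merchant ON payments (merchant_id)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM payments WHERE merchant_id = 7"
).fetchall()
```

The same discipline (inspect the plan, then index for the workload) carries over to PostgreSQL's `EXPLAIN ANALYZE` and the warehouse engines' query profiles.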

Financial Analytics & Automation :


- Develop and support ledger systems, reconciliation engines, and financial reporting pipelines.


- Work with transactional data including payments, settlements, refunds, fees, and interest calculations.


- Implement workflows for risk scoring, anomaly detection, and credit decisioning.


- Build and maintain analytics dashboards and reports for financial, operational, and business insights using tools like Power BI, Metabase, or Superset.


- Integrate and analyze data from payment gateways, banks, and third-party financial platforms.


- Automate financial workflows using data-driven pipelines and AI-powered solutions where applicable.
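
The reconciliation work above can be sketched as a simple matching pass between a payment-gateway feed and an internal ledger. Field names and the amount tolerance are illustrative; a production reconciliation engine would also handle currencies, value dates, and partial settlements.

```python
# Reconciliation sketch: flag transactions that are missing from either
# side, or whose amounts disagree beyond a tolerance.

def reconcile(gateway_rows, ledger_rows, tolerance=0.01):
    """Return transaction ids that fail to reconcile."""
    ledger = {row["txn_id"]: row["amount"] for row in ledger_rows}
    breaks = []
    for row in gateway_rows:
        booked = ledger.get(row["txn_id"])
        if booked is None or abs(booked - row["amount"]) > tolerance:
            breaks.append(row["txn_id"])
    # Transactions booked in the ledger but absent from the gateway feed.
    gateway_ids = {row["txn_id"] for row in gateway_rows}
    breaks.extend(t for t in ledger if t not in gateway_ids)
    return breaks

gateway = [
    {"txn_id": "T1", "amount": 100.00},
    {"txn_id": "T2", "amount": 59.99},
]
ledger = [
    {"txn_id": "T1", "amount": 100.00},
    {"txn_id": "T2", "amount": 49.99},  # amount mismatch
    {"txn_id": "T3", "amount": 10.00},  # missing from gateway feed
]
breaks = reconcile(gateway, ledger)
```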

AI & Advanced Analytics (Good to Have) :


- Support LLM-backed analytics and automation use cases.


- Build Retrieval-Augmented Generation (RAG) pipelines using vector databases such as Pinecone, Chroma, Weaviate, or Qdrant.


- Work with frameworks like LangChain and LlamaIndex and open-source LLMs such as Llama, Mistral, DeepSeek, or Kimi.


- Assist with embedding pipelines, fine-tuning workflows, and AI-driven insights for analytics and financial products.
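
The retrieval step at the heart of a RAG pipeline reduces to nearest-neighbour search over embeddings. The toy sketch below uses hand-made 3-dimensional vectors and plain cosine similarity; in production the vectors would come from an embedding model and be served by a vector store (Pinecone, Chroma, Weaviate, or Qdrant).

```python
import math

# Toy RAG retrieval: rank documents by cosine similarity to a query
# vector. Vectors here are hand-made stand-ins for real embeddings.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

documents = {
    "refund policy": [0.9, 0.1, 0.0],
    "loan interest calculation": [0.1, 0.8, 0.3],
    "settlement schedule": [0.2, 0.2, 0.9],
}

def retrieve(query_vec, docs, top_k=1):
    # Sort documents by similarity, highest first, and keep the top k.
    ranked = sorted(
        docs.items(), key=lambda kv: cosine(query_vec, kv[1]), reverse=True
    )
    return [name for name, _ in ranked[:top_k]]

# A query vector close to the "refund policy" embedding.
top = retrieve([0.85, 0.15, 0.05], documents)
```

The retrieved documents would then be injected into the LLM prompt as grounding context.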

Required Skills & Experience :

Core Technical Skills :


- 3+ years of hands-on experience in Data Engineering, ETL, or Analytics Engineering.


- Strong proficiency in Python (mandatory). Experience with Node.js or Java is a plus.


- Hands-on experience with ETL orchestration tools such as Airflow, Dagster, or Prefect.


- Experience with reverse-ETL tools (e.g., Hightouch) or custom reverse-ETL implementations.


- Strong SQL skills with experience in relational and analytical databases.


- Experience integrating systems using APIs and event-driven architectures.

Financial Domain Experience :


- Exposure to fintech, banking, payments, lending, or financial analytics platforms.


- Understanding of ledgers, transactions, reconciliation, risk models, and financial reporting.


- Experience building and maintaining BI dashboards using Power BI, Metabase, Superset, or similar tools.

Cloud & Platforms :


- Experience working with AWS and/or GCP cloud services.


- Familiarity with scalable, cloud-native data architectures and best practices.

Good to Have :


- Experience with real-time streaming and event-driven data pipelines.


- Exposure to ML models for risk scoring, fraud detection, or anomaly detection.


- Experience with LLMs, RAG systems, and AI-driven analytics.


- Background in microservices architecture and distributed systems.

Soft Skills :


- Strong analytical and problem-solving abilities.


- Ability to work independently as well as in cross-functional teams.


- Excellent communication skills to interact with both technical and business stakeholders.


- High ownership, attention to detail, and a proactive approach to problem-solving.

Why Join Us :


- Opportunity to work on high-impact fintech and data-driven platforms.


- Exposure to modern data stacks, cloud-native architectures, and AI-powered analytics.


- Fast-paced environment with ownership of systems end-to-end.


- Opportunity to design and scale financial analytics and data integration platforms from the ground up.

