Posted on: 06/08/2025
Key Responsibilities:
- Design, build, and maintain scalable data pipelines for processing financial data across functions like loan origination, disbursement, collection, and regulatory reporting.
- Develop and optimize ETL workflows to extract data from core banking systems, loan management platforms, payment gateways, credit bureaus, and other sources.
- Perform complex data blending and transformation to merge internal financial datasets with external credit scores, market data, and compliance-related information (a brief sketch follows this list).
- Ensure data quality, consistency, and regulatory compliance through robust transformation logic and validations.
- Build and maintain data warehouse architectures, including cloud-based solutions (e.g., Snowflake, Redshift, BigQuery).
- Collaborate closely with teams in finance, risk management, underwriting, and compliance to align data models and outputs with business needs.
- Implement data validation frameworks and monitoring systems to guarantee data accuracy for internal use and regulatory reporting.
- Tune query performance for large-scale datasets, improving data accessibility and processing speed.
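For illustration, a minimal sketch of the blending-and-validation work described above, written in Python with pandas. The file names and column layout (loan_id, principal, credit_score) are hypothetical placeholders, not systems named in this posting:

    # Minimal data-blending and validation sketch (pandas).
    # File names and column names are hypothetical placeholders.
    import pandas as pd

    # Internal loan records and external bureau scores.
    loans = pd.read_csv("loans.csv")            # assumed columns: loan_id, principal
    scores = pd.read_csv("bureau_scores.csv")   # assumed columns: loan_id, credit_score

    # Blend: left-join so every loan is kept even without a bureau score.
    blended = loans.merge(scores, on="loan_id", how="left")

    # Basic validations before the data reaches downstream reporting.
    assert blended["loan_id"].is_unique, "duplicate loan_id values"
    assert (blended["principal"] > 0).all(), "non-positive principal amounts"
    print(blended["credit_score"].isna().sum(), "loans have no bureau score")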
Technical Skills:
- Proficiency in SQL, Python, and/or Scala for advanced data processing.
- Hands-on experience with ETL tools like Informatica, Talend, SSIS, or similar.
- Experience with modern data warehousing platforms such as Snowflake, Amazon Redshift, Google BigQuery, or traditional enterprise DWH systems.
- Knowledge of big data technologies such as Hadoop, Apache Spark, and Kafka (a PySpark sketch follows this list).
- Working experience with cloud platforms such as AWS, Azure, or Google Cloud Platform.
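As a rough illustration of the Spark skill set above, a minimal PySpark aggregation; the input path and column names are assumptions made for the sketch:

    # Minimal PySpark aggregation sketch; path and columns are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("loan-repayments").getOrCreate()

    # Repayment events, assumed layout: loan_id, amount, paid_on.
    repayments = spark.read.parquet("s3://example-bucket/repayments/")

    # Total repaid and most recent payment per loan.
    summary = (
        repayments.groupBy("loan_id")
        .agg(F.sum("amount").alias("total_paid"),
             F.max("paid_on").alias("last_payment"))
    )
    summary.show(10)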
Domain Knowledge:
- Strong understanding of financial products and the loan lifecycle from origination to closure.
- Familiarity with loan origination systems, collections processes, and credit risk evaluation (see the DPD bucketing sketch after this list).
- Knowledge of financial compliance and regulatory frameworks such as RBI guidelines, CIBIL data formats, and other industry standards.
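To make the loan-lifecycle expectation concrete, a small days-past-due (DPD) bucketing sketch along the lines of the RBI's SMA/NPA bands (SMA-0 up to 30 days overdue, SMA-1 for 31-60, SMA-2 for 61-90, NPA beyond 90). The cut-offs here are a simplified reading of those guidelines, not a compliance reference:

    # Simplified DPD-to-classification mapping using the common 30/60/90-day
    # SMA bands; verify against the current RBI circular before relying on it.
    def dpd_bucket(days_past_due: int) -> str:
        if days_past_due <= 0:
            return "STANDARD"
        if days_past_due <= 30:
            return "SMA-0"
        if days_past_due <= 60:
            return "SMA-1"
        if days_past_due <= 90:
            return "SMA-2"
        return "NPA"

    print(dpd_bucket(45))  # SMA-1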
Preferred Qualifications:
- Experience in real-time data streaming and event-driven data architectures (a Kafka consumer sketch follows this list).
- Familiarity with data governance frameworks, data lineage, and metadata management tools.
- Certifications in cloud platforms (AWS, Azure, GCP) or data engineering tools.
- Understanding of machine learning workflows related to credit scoring or risk assessment.
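For the streaming qualification, a minimal event-driven consumer sketch using the kafka-python client; the topic name, broker address, and event shape are assumptions:

    # Minimal Kafka consumer sketch (kafka-python library).
    # Topic, broker, and message schema are hypothetical.
    import json
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "loan-events",                       # assumed topic name
        bootstrap_servers="localhost:9092",  # assumed broker address
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )

    # Handle each loan-lifecycle event as it arrives.
    for message in consumer:
        event = message.value
        print(event.get("loan_id"), event.get("event_type"))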
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1526063