Posted on: 21/07/2025
Responsibilities:
- Participate in requirements definition, analysis, and the design of logical and physical dimensional, NoSQL, or graph data models.
- Lead data discovery discussions with Business in JAD sessions and map the business requirements to logical and physical data modeling solutions.
- Conduct data model reviews with project team members.
- Capture technical metadata through data modeling tools.
- Ensure database designs efficiently support BI and end-user requirements.
- Drive continual improvement and enhancement of existing systems.
- Collaborate with ETL/Data Engineering teams to create data process pipelines for data ingestion and transformation.
- Collaborate with Data Architects for data model management, documentation, and version control.
- Maintain expertise and proficiency in the various application areas.
- Maintain current knowledge of industry trends and standards.
Required Skills:
- Strong data analysis and data profiling skills.
- Strong conceptual, logical, and physical data modeling skills for VLDB data warehouses and graph databases.
- Hands-on experience with modeling tools such as ERWIN or another industry-standard tool.
- Fluent in both normalized and dimensional model disciplines and techniques.
- Minimum of 3 years' experience in Oracle Database.
- Hands-on experience with Oracle SQL, PL/SQL, or Cypher.
- Exposure to Databricks Spark, Delta Technologies, Informatica ETL, or other industry-leading tools.
- Good knowledge or experience with AWS Redshift and Graph DB design and management.
- Working knowledge of AWS Cloud technologies, mainly the VPC, EC2, S3, DMS, and Glue services.
- Bachelor's degree in Software Engineering, Computer Science, or Information Systems (or equivalent experience).
- Excellent verbal and written communication skills, including the ability to describe complex technical concepts in relatable terms.
- Ability to manage and prioritize multiple workstreams with confidence in making decisions about prioritization.
- Data-driven mentality. Self-motivated, responsible, conscientious, and detail-oriented.
- Effective oral and written communication skills.
- Ability to learn and maintain knowledge of multiple application areas.
- Understanding of industry best practices pertaining to Quality Assurance concepts and procedures.
Education/Experience Level:
- Bachelor's degree in Computer Science, Engineering, or relevant fields with 3+ years of experience as a Data and Solution Architect supporting Enterprise Data and Integration Applications or a similar role for large-scale enterprise solutions.
- 3+ years of experience in Big Data Infrastructure and tuning experience in Lakehouse Data Ecosystem, including Data Lake, Data Warehouses, and Graph DB.
- AWS Solutions Architect Professional-level certification.
- Extensive experience in data analysis on critical enterprise systems like SAP, E1, Mainframe ERP, SFDC, Adobe Platform, and eCommerce systems.
Skill Set Required:
- GCP, Data Modelling (OLTP, OLAP), indexing, DBSchema, CloudSQL, BigQuery
- Data Modeller - Hands-on data modelling for OLTP and OLAP systems.
- In-depth knowledge of conceptual, logical, and physical data modelling.
- Strong understanding of indexing, partitioning, and data sharding, with practical hands-on experience applying them.
- Strong understanding of the variables impacting database performance for near-real-time reporting and application interaction.
- Working experience with at least one data modelling tool, preferably DBSchema.
- Functional knowledge of the mutual fund industry is a plus.
- Good understanding of GCP databases such as AlloyDB, CloudSQL, and BigQuery.
Posted in: Data Analytics & BI
Functional Area: Data Analysis / Business Analysis
Job Code: 1516044