Posted on: 01/09/2025
Role: Graph Data Engineer
Experience: 3 to 5 years
Mandatory Skills: Neo4j (minimum 1 year required), GCP, graph data models (LPG, RDF)
Notice Period: Maximum 30 days
Hiring Location: PAN India (remote position)
Job Description:
We are seeking a skilled Graph Data Engineer to join our team on a complex Supply Chain project.
In this role, you will be responsible for implementing and optimizing data solutions in the graph database space.
You will work with Neo4j and GCP technologies to model and manage graph data, enabling advanced analytics and visualization for the Supply Chain project.
Key Required Skills:
- Proficiency in graph data modeling, including experience with graph data models (Labeled Property Graphs - LPG, Resource Description Framework - RDF) and the Cypher graph query language (see the first sketch after this list).
- Strong experience with Neo4j, including working with Neo4j Aura and optimizing complex queries for performance.
- Familiarity with Google Cloud Platform (GCP) services such as BigQuery, Google Cloud Storage (GCS), and Dataproc.
- Desirable: experience with PySpark and SparkSQL for data processing and analysis (see the second sketch after this list).
- Ability to expose graph data to visualization tools such as NeoDash, Tableau, and Power BI.
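For candidates gauging their fit, here is a minimal sketch of the kind of LPG modeling and Cypher querying the role involves, run through the official Neo4j Python driver. The connection URI, credentials, and the Supplier/Part/SUPPLIES names are illustrative assumptions, not details from this posting.

```python
# Minimal sketch: model supplier-part relationships as a labeled property
# graph (LPG) and query it with Cypher. All names and credentials below
# are hypothetical placeholders.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("neo4j://localhost:7687", auth=("neo4j", "password"))

with driver.session() as session:
    # Create a tiny LPG: two suppliers linked to one shared part,
    # with lead time stored as a relationship property.
    session.run(
        """
        MERGE (s1:Supplier {name: 'Acme'})
        MERGE (s2:Supplier {name: 'Globex'})
        MERGE (p:Part {sku: 'P-100'})
        MERGE (s1)-[:SUPPLIES {lead_time_days: 14}]->(p)
        MERGE (s2)-[:SUPPLIES {lead_time_days: 21}]->(p)
        """
    )
    # Parameterized Cypher query: which suppliers can provide part P-100,
    # ordered by lead time?
    result = session.run(
        """
        MATCH (s:Supplier)-[r:SUPPLIES]->(p:Part {sku: $sku})
        RETURN s.name AS supplier, r.lead_time_days AS lead_time
        ORDER BY lead_time
        """,
        sku="P-100",
    )
    for record in result:
        print(record["supplier"], record["lead_time"])

driver.close()
```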
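And a second sketch for the desirable PySpark/SparkSQL skill: preparing tabular supplier data before loading it into a graph. The schema and values are toy assumptions; in practice such data might come from BigQuery or files on GCS via Dataproc.

```python
# Minimal sketch: aggregate supplier lead times with SparkSQL.
# The rows and column names are illustrative, not from the posting.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("supplier-prep").getOrCreate()

# Toy supplier-part relationships.
rows = [
    ("Acme", "P-100", 14),
    ("Globex", "P-100", 21),
    ("Acme", "P-200", 7),
]
df = spark.createDataFrame(rows, ["supplier", "sku", "lead_time_days"])
df.createOrReplaceTempView("supplies")

# SparkSQL aggregation: average lead time per supplier.
spark.sql(
    """
    SELECT supplier, AVG(lead_time_days) AS avg_lead_time
    FROM supplies
    GROUP BY supplier
    ORDER BY avg_lead_time
    """
).show()

spark.stop()
```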
The Expertise You Have:
- Bachelor's or Master's degree in a technology-related field (e.g., Engineering, Computer Science).
- Demonstrable experience in designing and implementing data solutions in the Graph DB space.
- Hands-on experience with graph databases, preferably Neo4j.
- Experience tuning graph databases for performance.
- Deep understanding of graph data model paradigms (LPG, RDF) and the Cypher graph query language.
- Strong grasp of graph data modeling, schema development, and data design.
- Proficiency in relational databases and hands-on SQL experience.
Desirable Skills:
- Knowledge of data ingestion (ETL/ELT) and messaging/streaming technologies (e.g., GCP Data Fusion, Kinesis, Kafka).
- Experience with API development and in-memory technologies.
- Understanding of developing highly scalable distributed systems using open-source technologies.
- Prior experience with supply chain data is desirable but not essential.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1538829