Memory and Context / Knowledge Representation

Knowledge Graph

Advanced [4/5]
KG · Semantic network · Entity graph

Definition

Knowledge graphs represent information as a network of entities (nodes) and relationships (edges). They encode structured knowledge that can be queried, reasoned over, and used to enhance LLM responses with factual grounding.

In LLM applications, knowledge graphs provide explicit factual context that complements the model's implicit knowledge.

Key Concepts

  • Entities: Nodes representing things (people, places, concepts)
  • Relations: Edges describing how entities connect
  • Triples: (subject, predicate, object) knowledge units
  • Graph RAG: Combining KG retrieval with LLM generation
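
The first three concepts can be made concrete in a few lines. The sketch below (a minimal in-memory triple store, not a production design) stores facts as `(subject, predicate, object)` tuples and supports pattern queries where `None` acts as a wildcard:

```python
# Minimal in-memory triple store: each fact is one
# (subject, predicate, object) tuple.
triples = [
    ("Einstein", "born_in", "Ulm"),
    ("Einstein", "developed", "Relativity"),
    ("Einstein", "won", "Nobel Prize"),
]

def query(triples, subject=None, predicate=None, obj=None):
    """Return all triples matching the pattern; None matches anything."""
    return [
        (s, p, o)
        for (s, p, o) in triples
        if (subject is None or s == subject)
        and (predicate is None or p == predicate)
        and (obj is None or o == obj)
    ]

print(query(triples, subject="Einstein", predicate="developed"))
# → [('Einstein', 'developed', 'Relativity')]
```

Real systems store triples in a graph database (Neo4j, RDF stores) and query with Cypher or SPARQL, but the data model is the same.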

Examples

Structure
Knowledge Graph Visualization
            ┌───────────────┐
            │   Einstein    │
            │   (Person)    │
            └───────┬───────┘
                    │
       ┌────────────┼────────────┐
       │            │            │
       ▼            ▼            ▼
  "born_in"   "developed"      "won"
       │            │            │
       ▼            ▼            ▼
 ┌─────────┐  ┌───────────┐  ┌────────────┐
 │   Ulm   │  │ Relativity│  │Nobel Prize │
 │ (City)  │  │ (Theory)  │  │  (Award)   │
 └─────────┘  └─────┬─────┘  └────────────┘
                    │
               "describes"
                    │
                    ▼
            ┌──────────────┐
            │  Space-Time  │
            │  (Concept)   │
            └──────────────┘

Triples:
- (Einstein, born_in, Ulm)
- (Einstein, developed, Relativity)
- (Einstein, won, Nobel Prize)
- (Relativity, describes, Space-Time)
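
The triples from the diagram can be traversed directly. A minimal sketch of multi-hop retrieval (a plain breadth-first expansion, ignoring edge direction for retrieval purposes) shows how a 2-hop query starting at Einstein reaches Space-Time via Relativity:

```python
from collections import deque

# Triples from the diagram above.
triples = [
    ("Einstein", "born_in", "Ulm"),
    ("Einstein", "developed", "Relativity"),
    ("Einstein", "won", "Nobel Prize"),
    ("Relativity", "describes", "Space-Time"),
]

def neighbors(entity):
    """Entities directly connected to `entity`, in either direction."""
    outgoing = [(p, o) for s, p, o in triples if s == entity]
    incoming = [(p, s) for s, p, o in triples if o == entity]
    return outgoing + incoming

def subgraph(start, hops=2):
    """Breadth-first expansion collecting edges up to `hops` from `start`."""
    seen, frontier, facts = {start}, deque([(start, 0)]), []
    while frontier:
        node, depth = frontier.popleft()
        if depth == hops:
            continue  # don't expand beyond the hop limit
        for pred, other in neighbors(node):
            if (node, pred, other) not in facts and (other, pred, node) not in facts:
                facts.append((node, pred, other))
            if other not in seen:
                seen.add(other)
                frontier.append((other, depth + 1))
    return facts

# Two hops from Einstein reach Space-Time via Relativity;
# one hop does not.
print(subgraph("Einstein", hops=2))
```

This is exactly what the "1-2 hop neighbors" pro tip below the examples refers to: each extra hop grows the retrieved context quickly, so small radii keep prompts focused.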
Implementation
Graph RAG Pipeline
from neo4j import GraphDatabase

class GraphRAG:
    def __init__(self, uri, user, password):
        self.driver = GraphDatabase.driver(uri, auth=(user, password))

    def extract_entities(self, query, llm):
        """Use the LLM to extract entities from the query."""
        prompt = f"Extract key entities from: '{query}'"
        return llm.generate(prompt)  # e.g. ["Einstein", "physics"]

    def retrieve_subgraph(self, entities, hops=2):
        """Fetch the relevant knowledge-graph neighborhood."""
        # Cypher does not allow parameters in the pattern length,
        # so the (trusted, integer) hop count is interpolated directly.
        cypher = f"""
        MATCH (e)-[r*1..{hops}]-(connected)
        WHERE e.name IN $entities
        RETURN e, r, connected
        """
        with self.driver.session() as session:
            result = session.run(cypher, entities=entities)
            return self.format_as_context(result)

    def format_as_context(self, result):
        """Render graph records as one fact per line for the prompt."""
        return "\n".join(str(record) for record in result)

    def answer(self, query, llm):
        """Full Graph RAG pipeline: extract -> retrieve -> generate."""
        entities = self.extract_entities(query, llm)
        context = self.retrieve_subgraph(entities)
        prompt = f"""Using this knowledge:
{context}

Answer: {query}"""
        return llm.generate(prompt)

Interactive Exercise

Model as Knowledge Graph

Convert this sentence into knowledge graph triples:

Sentence: "Apple Inc., founded by Steve Jobs in Cupertino, makes the iPhone."

Pro Tips
  • Use LLMs to automatically extract KG triples from text
  • Start with 1-2 hop neighbors to avoid context bloat
  • Combine KG retrieval with vector search: the graph supplies precise structured facts, the vectors supply fuzzy matches over unstructured text
  • Knowledge graphs excel at multi-hop reasoning queries
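
The hybrid-retrieval tip above can be sketched as follows. This is an illustrative shape only: `vector_search` and `kg_facts` are hypothetical stand-ins stubbed with static data, where a real system would call a vector store and a graph database.

```python
def vector_search(query, k=2):
    # Stub: a real system would embed `query` and run nearest-neighbor search.
    passages = [
        "Einstein published the general theory of relativity in 1915.",
        "The Nobel Prize in Physics 1921 was awarded to Albert Einstein.",
    ]
    return passages[:k]

def kg_facts(entity):
    # Stub: a real system would run a Cypher/SPARQL query.
    triples = [
        ("Einstein", "developed", "Relativity"),
        ("Einstein", "won", "Nobel Prize"),
    ]
    return [f"({s}, {p}, {o})" for s, p, o in triples if s == entity]

def build_context(query, entity):
    """Combine structured facts and unstructured passages into one prompt block."""
    facts = kg_facts(entity)
    passages = vector_search(query)
    return "Facts:\n" + "\n".join(facts) + "\n\nPassages:\n" + "\n".join(passages)

context = build_context("What did Einstein win?", "Einstein")
print(context)
```

The combined block would then be placed ahead of the user question in the LLM prompt, exactly as in the `answer` method of the Graph RAG pipeline above.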

Related Terms