Memory · Knowledge Graph · MCP · PostgreSQL

Graph-Based Memory Without Embeddings

Knowledge graph approach to AI memory with explicit relationships and human-readable retrieval.

AI Infrastructure · 2 months · Internal Infrastructure

Key Results

Predictable retrieval
Explicit relationships
Evolution tracking
Graceful decay

The Problem with Embeddings

Traditional AI memory systems store text blobs alongside vector embeddings. Semantic search sounds great in theory, but it fails in practice in several ways:

Semantic gap — query embeddings ≠ narrative embeddings. "Tell me about embodiment" and "On December 9 we discussed embodiment as expansion, not copying" live in different vector spaces.

Temporal meaninglessness — "yesterday", "last week" have no semantic value for embedding models.

Duplication instead of growth — 10 separate memories about one topic instead of one evolving node.

False confidence — system confidently returns *something* — just not what you need.


The Solution: Concept Graphs

Memory Graph takes a fundamentally different approach:

  • Nodes = Concepts that grow and enrich over time
  • Edges = Explicit relationships between concepts
  • Retrieval = Human-readable keys + graph navigation
  • Curation = AI-driven — only the LLM writes and structures memory

Concept Nodes

Four node types cover different kinds of knowledge:

Entity — people, companies, projects, places. Concrete things with identity.

Theme — topics, concepts, areas of interest. Abstract ideas that recur.

Event — specific occurrences with dates. Things that happened.

Insight — realizations, patterns, conclusions. Learned understanding.
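A minimal sketch of what a concept node might look like in code. The field names (`key`, `summary`, `deltas`, `importance`) are illustrative assumptions, not the system's actual schema:

```python
from dataclasses import dataclass, field
from enum import Enum


class NodeType(Enum):
    ENTITY = "entity"    # people, companies, projects, places
    THEME = "theme"      # recurring abstract topics
    EVENT = "event"      # dated occurrences
    INSIGHT = "insight"  # realizations and conclusions


@dataclass
class ConceptNode:
    key: str             # human-readable retrieval key
    node_type: NodeType
    summary: str         # newest-first, with dates inline
    deltas: list[str] = field(default_factory=list)
    importance: float = 1.0


node = ConceptNode(
    key="embodiment",
    node_type=NodeType.THEME,
    summary="2024-12-09: embodiment as expansion, not copying",
)
```

The human-readable `key` is what makes retrieval predictable: you look up "embodiment", not a point in vector space.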


Explicit Relationships

Five relationship types make connections meaningful:

  • related (связан) — general connection
  • develops (развивает) — B deepens understanding of A
  • part of (часть) — B belongs to A
  • context (контекст) — B provides context for A
  • contradicts (противоречит) — B conflicts with A
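An edge can then be sketched as a typed, weighted link between two node keys. The `strength` field is an assumption based on the strength-based filtering described below:

```python
from dataclasses import dataclass
from enum import Enum


class RelationType(Enum):
    RELATED = "связан"          # general connection
    DEVELOPS = "развивает"      # B deepens understanding of A
    PART_OF = "часть"           # B belongs to A
    CONTEXT = "контекст"        # B provides context for A
    CONTRADICTS = "противоречит"  # B conflicts with A


@dataclass
class Edge:
    source: str          # key of node A
    target: str          # key of node B
    relation: RelationType
    strength: float = 1.0  # grows when the connection is reaffirmed


edge = Edge("embodiment", "ai-memory", RelationType.PART_OF)
```

Because the relation type is stored explicitly, the system can answer "why are these connected?" directly, something an inferred cosine similarity cannot do.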

Evolution Tracking

Nodes grow over time:

Summary with embedded timestamps — newest information first, dates inline for context.

Delta history — incremental updates preserve the evolution of understanding.

Archival system — old deltas are archived, not deleted. History is preserved.

Importance decay — unused nodes fade but don't disappear. Graceful forgetting.
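One way to implement graceful decay is exponential fading with a floor, so unused nodes lose importance but never reach zero. The half-life and floor values here are illustrative assumptions, not the system's actual parameters:

```python
def decayed_importance(importance: float, days_idle: float,
                       half_life_days: float = 30.0,
                       floor: float = 0.1) -> float:
    """Halve importance every half_life_days of inactivity,
    but never drop below the floor: fade, don't disappear."""
    decayed = importance * 0.5 ** (days_idle / half_life_days)
    return max(decayed, floor)
```

A node untouched for a month drops to half weight; one untouched for a year settles at the floor, still discoverable but no longer crowding out active concepts.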


Graph Navigation

BFS traversal — find related concepts by walking the graph.

Adjustable depth — 1-3 hops depending on context needs.

Strength-based filtering — strong relationships surface first.

Full-text search — backup search when you need exact matches.


MCP Tools

Eight tools expose the full functionality:

  • upsert_node — create or update concepts with connections
  • get_node — retrieve concept with all relationships
  • connect_nodes — create or strengthen edges
  • find_related — BFS graph traversal
  • search_nodes — full-text search
  • list_concepts — lightweight index of all concepts
  • consolidate_node — archive old deltas, refresh summary
  • delete_node — permanent removal (with confirmation)
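To make the tool surface concrete, here is roughly what two calls might look like as MCP request payloads. The argument names and shapes are hypothetical, chosen to match the descriptions above rather than the server's real schema:

```python
# Hypothetical upsert_node call: grow an existing concept with a
# dated delta and connect it to a parent node in one step.
upsert_request = {
    "tool": "upsert_node",
    "arguments": {
        "key": "embodiment",
        "type": "theme",
        "delta": "2024-12-09: embodiment as expansion, not copying",
        "connect": [{"to": "ai-memory", "relation": "часть"}],
    },
}

# Hypothetical find_related call: 2-hop BFS with a strength cutoff.
find_request = {
    "tool": "find_related",
    "arguments": {"key": "embodiment", "depth": 2, "min_strength": 0.5},
}
```

Note that `upsert_node` adds a delta to an existing node rather than creating a duplicate, which is how "growth over duplication" shows up at the API level.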

Design Philosophy

Simplicity — minimum structure that doesn't feel like overhead.

Growth over duplication — one rich node beats 50 scattered blobs.

Connectivity — graph as navigation map, not just storage.

Intentionality — save because you understand why, not "just in case".

Graceful decay — unused nodes fade but don't disappear.


Why No Embeddings?

This is a deliberate design choice. Graph navigation with human-readable keys is:

  • Predictable — you know what you'll get
  • Debuggable — you can see why something was retrieved
  • Evolutionary — nodes grow instead of duplicating
  • Relationship-aware — connections are explicit, not inferred

Embeddings have their place. Memory Graph is for when you need reliability over serendipity.


Powers our AI agent memory systems. Available as part of our Custom AI Agents consulting.
