Soul is the memory and identity layer that makes your AI agent recognizably itself — across conversations, sessions, and the model upgrades that would otherwise erase everything it learned.
The hard part of building AI agents isn't the single conversation. It's everything that has to happen between them.
Soul treats memory and identity as inseparable — because they are. An agent's identity shapes what it remembers. Its memories shape who it becomes.
Specific past interactions, captured and indexed automatically as conversations unfold. The right episode surfaces when it matters, not the most semantically similar text.
Facts learned about the user, the domain, or the world — structured, queryable, and reconciled against contradictions rather than silently overwritten.
Persona, voice, values, and consistent preferences configured above the memory layer. Referenced on every interaction. Survives model upgrades unchanged.
Every feature shaped around what actually breaks in production memory systems — not what breaks in demos.
Built for agents, not documents. Understands recency, importance, and the difference between a passing comment and a stable preference. Surfaces episodes, not passages.
Memories and identity survive when the underlying model is upgraded, replaced, or temporarily routed elsewhere. No model-specific assumptions encoded in the store.
Contradicting facts are reconciled with explicit policies. Details fade faster than preferences. Memory stays clean without application code managing it.
Users review, edit, and delete what an agent knows about them. Deletions propagate fully — not soft deletes living on as embedded fragments somewhere in the system.
Clear partitioning boundaries so memories from one context never bleed into another. Multi-agent systems get a shared substrate with appropriate scoping per participant.
Returns a curated memory set trimmed to fit available context. The agent doesn't waste working memory on noise or get buried under an ever-growing accumulation of stale memories.
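Context-budget curation can be sketched in a few lines. This is a toy illustration of the idea, not Soul's actual selection logic: score each candidate memory, then greedily pack the highest-scoring ones into the available token budget.

```python
from dataclasses import dataclass

@dataclass
class Memory:
    text: str
    score: float   # combined recency/importance relevance (illustrative)
    tokens: int    # rough token count of the memory text

def curate(memories: list[Memory], budget: int) -> list[Memory]:
    """Greedily take the highest-scoring memories until the token budget is spent."""
    chosen, used = [], 0
    for m in sorted(memories, key=lambda m: m.score, reverse=True):
        if used + m.tokens <= budget:
            chosen.append(m)
            used += m.tokens
    return chosen
```

The point is the contract, not the algorithm: whatever the scoring model, the caller hands over a budget and gets back a set that is guaranteed to fit.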
Two primary operations, one persistent state. A working memory loop in your agent within the hour.
Configure the agent's persona with a small definition: voice, values, stable preferences. This layer is referenced on every interaction and never drifts.
As conversations unfold, send Soul the events worth remembering. Soul extracts, classifies, indexes, and reconciles — automatically.
At the start of each new interaction, retrieve a curated, context-window-aware set of memories. The agent picks up exactly where it left off.
Upgrade, replace, or route to a different model. Soul's state travels unchanged. The agent remains itself — no rebuilding, no data migration.
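The steps above form a small loop. The client below is a toy in-memory stand-in written to show the shape of that loop; the real SDK's names, signatures, and return types will differ.

```python
class SoulClient:
    """Toy stand-in for the memory loop: define identity once, write events
    as they happen, retrieve curated context at the start of each interaction.
    All names here are illustrative, not the actual Soul API."""

    def __init__(self):
        self.identity = {}
        self.memories = []

    def define_identity(self, **traits) -> None:
        # Persona layer: referenced on every interaction, never drifts.
        self.identity = traits

    def remember(self, event: str) -> None:
        # In the real system this step would extract, classify, and reconcile.
        self.memories.append(event)

    def recall(self, query: str, limit: int = 5) -> dict:
        # Naive substring match standing in for real retrieval.
        hits = [m for m in self.memories if query.lower() in m.lower()][:limit]
        return {"identity": self.identity, "memories": hits}

soul = SoulClient()
soul.define_identity(voice="concise", values=["accuracy"])
soul.remember("User prefers dark mode")
ctx = soul.recall("dark mode")
```

Because the state lives entirely outside the model, the same `ctx` can be handed to any model the agent is routed to.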
What separates Soul from a vector database, a persona library, or a bespoke memory module nobody on the team enjoys maintaining.
An agent's identity shapes what it considers worth remembering. Its memories shape who it becomes. Soul is built on the conviction that the two cannot be cleanly separated — and it designs for both at once.
The patterns that work for RAG against a static knowledge base are the wrong patterns for an agent's memory of a person. Soul surfaces episodes, understands stable versus momentary preferences, and resolves contradictions.
The alternative to model-agnostic memory is rebuilding everything every time a better model arrives. That churn keeps teams stuck on whatever they shipped first. Soul prevents it.
Memory laws are tightening. Agents that cannot honor a deletion request quickly become a liability. Soul's visibility, editing, and deletion are first-class operations from day one — and the deletions are real.
Soul is a strong fit for any AI product where the user expects continuity — which is most AI products worth building.
Personal assistants need continuity almost by definition. An assistant that does not remember its user is not really assisting anyone — it's just a single-session autocomplete.
A coding agent that learned the project's conventions on Monday should not need to be retaught them on Tuesday. Soul retains decisions, preferences, and context across the entire lifetime of the project.
A customer support agent that handled a complex case last week should recognize the same customer and pick up where the conversation left off — without making them repeat their entire history.
In multi-agent systems, agents need to remember what other agents have done, what state the shared work is in, and how their own role fits. Soul provides the substrate for this without brittle ad-hoc state passing.
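The partitioning idea behind the multi-agent substrate can be shown with a toy store keyed by scope. The `(org, agent, user)` key shape is an assumption for illustration; the real scoping model may be richer.

```python
from collections import defaultdict

class ScopedStore:
    """Toy partitioned store: every read and write is keyed by
    (org, agent, user), so one scope's memories are invisible to another.
    Key shape and method names are illustrative only."""

    def __init__(self):
        self._parts = defaultdict(list)

    def write(self, org: str, agent: str, user: str, memory: str) -> None:
        self._parts[(org, agent, user)].append(memory)

    def read(self, org: str, agent: str, user: str) -> list[str]:
        # Only the exact partition is consulted; there is no cross-scope path.
        return list(self._parts[(org, agent, user)])
```

The guarantee is structural: bleed between contexts is impossible because no query path exists that spans partitions.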
SDKs in the languages agent teams actually build in. Integrations with the frameworks the ecosystem has settled around. No rearchitecting required.
As memory regulation tightens, agents that cannot honor deletion requests become a liability. Soul was designed with compliance as a first-class concern from the start.
Full user memory visibility, editing, and real deletion
Partitioned per user, agent, and organization — no memory bleed
Audit trail for every memory write, read, and deletion event
Self-hosted edition — memories stay inside your infrastructure
Conflict resolution policies explicit — no silent overwrites
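What "real deletion" means in practice: the record, its index entries, and any derived fragments are all removed, and the removal itself is audited. The toy store below illustrates that invariant; it is not Soul's implementation.

```python
class MemoryStore:
    """Toy store showing hard deletion with an audit trail. A delete removes
    the record and every index posting that points at it, then logs the event.
    All structures here are illustrative."""

    def __init__(self):
        self.memories = {}   # memory id -> text
        self.index = {}      # term -> set of memory ids
        self.audit = []      # (action, memory_id) events

    def write(self, mid: str, text: str) -> None:
        self.memories[mid] = text
        for term in text.lower().split():
            self.index.setdefault(term, set()).add(mid)
        self.audit.append(("write", mid))

    def delete(self, mid: str) -> None:
        # Hard delete: no soft-delete flag, no orphaned index entries.
        self.memories.pop(mid, None)
        for ids in self.index.values():
            ids.discard(mid)
        self.index = {t: ids for t, ids in self.index.items() if ids}
        self.audit.append(("delete", mid))
```

A soft delete would leave the index postings in place, which is exactly the "embedded fragments living on somewhere in the system" failure mode described above.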
Annual third-party audit
At rest and in transit
On-premises available
Full right to erasure
Define an identity. Write the first memory. Retrieve context on the next conversation. Most teams have a working loop within an hour. The platform scales from there.