Add persistent memory to your AI app in 3 minutes. Your users' preferences, corrections, and facts follow them across every model.
1. Change One Line — Get Memory Instantly
Already using OpenAI or Anthropic? Just change the base_url. No SDK, no install, no new code. Your app gets persistent memory across every session.
# Before (no memory):
from openai import OpenAI
client = OpenAI()

# After (persistent memory forever):
client = OpenAI(base_url="http://localhost:8181/v1")
# That's it. Same code. Every conversation now remembered.
response = client.chat.completions.create(
model="gpt-4",
messages=[{"role": "user", "content": "I prefer Python"}]
)
# Next session, GPT-4 already knows the user prefers Python.
# Before (no memory):
import anthropic
client = anthropic.Anthropic()

# After (persistent memory forever):
client = anthropic.Anthropic(base_url="http://localhost:8181")
# Same code. Claude now remembers every user.
message = client.messages.create(
model="claude-sonnet-4-20250514",
max_tokens=1024,
messages=[{"role": "user", "content": "I just switched to TypeScript"}]
)
# LangChain — same: just set base_url
from langchain_openai import ChatOpenAI
llm = ChatOpenAI(
model="gpt-4",
base_url="http://localhost:8181/v1"
)
# Every LangChain chain now has persistent user memory.
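Under the hood, an OpenAI-compatible memory proxy like this one recalls stored facts for the user and injects them into the outgoing request before forwarding it upstream. A minimal sketch of that injection step, assuming a simple fact list and the standard OpenAI chat message format (the `inject_memory` helper is illustrative, not the proxy's actual internals):

```python
def inject_memory(messages, facts):
    """Prepend recalled user facts as a system message.

    `messages` uses the OpenAI chat format; `facts` stands in for
    whatever the memory store returned for this user.
    """
    if not facts:
        return messages
    memory_block = "Known facts about this user:\n" + "\n".join(
        f"- {fact}" for fact in facts
    )
    return [{"role": "system", "content": memory_block}] + messages

# Example: two facts recalled from earlier sessions
facts = ["Prefers Python", "Works at Google"]
messages = [{"role": "user", "content": "Help me write a script"}]
augmented = inject_memory(messages, facts)
```

Because the injection happens on the proxy side, the client code stays identical to a plain OpenAI or Anthropic call — which is why changing `base_url` is enough.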
2. Want More Control? Use the Brain SDK
For developers who want to call learn, ask, and forget programmatically.
from clsplusplus import Brain
brain = Brain("alice")
brain.learn("Works at Google as a senior engineer")
brain.learn("Prefers Python and VS Code")
brain.ask("Where does she work?")
# → ["Works at Google as a senior engineer"]
brain.context("coding help")
# → "Known facts about this user:\n- Works at Google...\n- Prefers Python..."# 11 methods: learn, ask, context, forget, wrap, absorb, teach, watch, correct, chat, who
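The string returned by `brain.context()` is plain text, so it drops straight into a system prompt for whatever model you call next. A sketch of that pattern, with a local `facts` list standing in for what the Brain would return (the `build_system_prompt` helper is hypothetical, shown only to illustrate the format):

```python
def build_system_prompt(facts, task):
    """Format known facts the way the Brain's context string looks,
    then append the current task."""
    context = "Known facts about this user:\n" + "\n".join(
        f"- {fact}" for fact in facts
    )
    return f"{context}\n\nCurrent task: {task}"

# Stand-in for brain.context("coding help")
facts = ["Works at Google as a senior engineer", "Prefers Python and VS Code"]
prompt = build_system_prompt(facts, "coding help")
```

In a real app you would pass `prompt` as the `system` message of a chat completion request, so the model sees accumulated user facts on every turn.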