Add persistent memory to your AI agents with semantic search and automatic fact extraction. Built on PostgreSQL + pgvector.
```python
from cluttr import Cluttr

memory = Cluttr(config)

async with memory:
    # Store memories from conversations
    await memory.add(messages, user_id="user_123")

    # Search relevant memories
    results = await memory.search("...", user_id="user_123")
```

Cluttr handles the complexity of memory management so you can focus on building great AI experiences.
- **Fact extraction:** LLM-powered extraction of important facts, preferences, and context from conversations.
- **Semantic search:** Find relevant memories using vector similarity search powered by pgvector.
- **Image understanding:** Automatic summarization of images in conversations. Supports base64, URLs, and file paths.
- **Deduplication:** Smart semantic similarity checks prevent storing redundant information.
- **Multiple providers:** Works with OpenAI (GPT-4o-mini) and AWS Bedrock (Claude + Titan).
- **Async-first:** Built with async/await from the ground up for high-performance applications.
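The deduplication idea above can be sketched in plain Python: compare the embedding of a candidate memory against stored embeddings and skip it when the cosine similarity crosses a threshold. The helpers and the `0.9` threshold below are illustrative, not part of Cluttr's API:

```python
import math

def cosine_similarity(a, b):
    # Dot product of the two vectors divided by the product of their norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def is_duplicate(new_emb, existing_embs, threshold=0.9):
    # A candidate is redundant if it is near-parallel to any stored embedding.
    return any(cosine_similarity(new_emb, e) >= threshold for e in existing_embs)

stored = [[1.0, 0.0], [0.0, 1.0]]
print(is_duplicate([0.99, 0.05], stored))  # True: nearly parallel to the first vector
print(is_duplicate([0.5, 0.5], stored))    # False: ~0.71 similarity to both
```

In production the same comparison runs inside Postgres via pgvector's distance operators rather than in application code.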
Get up and running in minutes.
```shell
uv add cluttr
```

```python
config = {
    "vector_db": {
        "engine": "postgres",
        "host": "localhost",
        "database": "cluttr",
        "user": "postgres",
        "password": "secret",
    },
    "llm": {
        "provider": "openai",  # or "bedrock"
        "api_key": "sk-...",
    },
}

memory = Cluttr(config)

async with memory:
    # Add memories from conversation
    await memory.add([
        {"role": "user", "content": "I love Python!"},
        {"role": "assistant", "content": "Great choice!"},
    ], user_id="user_123")

    # Search memories
    results = await memory.search(
        "programming preferences",
        user_id="user_123",
    )
```
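A common next step is to fold retrieved memories into the next LLM call. A minimal sketch, assuming each search result exposes a `text` field (the actual shape of Cluttr's results may differ):

```python
def build_system_prompt(memories):
    # Turn retrieved memories into a system prompt for the next model call.
    # `memories` is assumed to be a list of dicts with a "text" key.
    if not memories:
        return "You are a helpful assistant."
    facts = "\n".join(f"- {m['text']}" for m in memories)
    return f"You are a helpful assistant. Known facts about the user:\n{facts}"

prompt = build_system_prompt([
    {"text": "Prefers Python"},
    {"text": "Is building AI agents"},
])
print(prompt)
```

Adapt the field names to whatever your installed version of Cluttr actually returns from `search`.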