App 09 — Memory

Context that persists.

Per-user conversation memory with relevance search and auto-summarization. Give your LLM apps recall without a vector database.

Start free · See the API
Relevant memories for: "deployment pipeline" (3 results)
me_8a4f2c
User discussed setting up a CI/CD pipeline with GitHub Actions to deploy to Railway on every push to main.
relevance: 0.87   2 hours ago
me_3b7e1a
User prefers zero-downtime deployments and asked about blue-green vs rolling strategies for their Go service.
relevance: 0.74   1 day ago
me_c91d5f
User's production stack: Go binary on Railway, Cloudflare DNS, SQLite for persistence.
relevance: 0.61   3 days ago
Store & Retrieve
Store conversation snippets per user. Retrieve them later by ID, by user, or by relevance to a query. Simple REST API.
Relevance Search
Trigram-based cosine similarity finds the most relevant memories for any query. No embeddings API, no vector database — runs locally in the binary.
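To make the approach concrete, here is a minimal sketch of trigram-based cosine similarity in Go. It is an illustration of the technique, not Stockyard's actual scoring code, and it treats strings as ASCII bytes for simplicity.

```go
package main

import (
	"fmt"
	"math"
	"strings"
)

// trigrams splits a lowercased string into overlapping 3-byte
// shingles and counts each one.
func trigrams(s string) map[string]float64 {
	s = strings.ToLower(s)
	counts := make(map[string]float64)
	for i := 0; i+3 <= len(s); i++ {
		counts[s[i:i+3]]++
	}
	return counts
}

// cosine computes the cosine similarity of two trigram count vectors:
// dot(a, b) / (|a| * |b|), ranging from 0 (no shared trigrams) to 1.
func cosine(a, b map[string]float64) float64 {
	var dot, normA, normB float64
	for t, x := range a {
		dot += x * b[t]
		normA += x * x
	}
	for _, y := range b {
		normB += y * y
	}
	if normA == 0 || normB == 0 {
		return 0
	}
	return dot / (math.Sqrt(normA) * math.Sqrt(normB))
}

func main() {
	q := trigrams("deployment pipeline")
	m := trigrams("setting up a CI/CD pipeline to deploy on every push")
	fmt.Printf("%.2f\n", cosine(q, m))
}
```

Because trigram vectors are tiny and computed on the fly, this kind of scoring can run in-process against every stored memory with no embeddings API involved.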
Auto-Summarize
Condense old memories into summary entries automatically. Keeps context fresh without unbounded storage growth.
TTL Expiry
Set expiration on any memory entry: "24h", "7d", "30d". Expired entries are filtered out of all queries automatically.
Per-User Scoping
Every memory is scoped to a user ID. No cross-contamination. Users only see their own context. Clean multi-tenant isolation.
No External Dependencies
Memory runs on SQLite inside the binary. No Pinecone, no Weaviate, no Redis. One fewer service to manage in production.
The API

Store memories, query by relevance, summarize old context — all via REST.

# Store a memory
curl -X POST /api/memory/user_123 \
  -d '{"content":"User prefers Python for scripting tasks", "expires_in":"30d"}'

# Find relevant memories (quote the URL so the shell doesn't eat the &)
curl '/api/memory/user_123/relevant?query=scripting+language&limit=5'

# List all memories for a user
curl /api/memory/user_123

# Delete a specific memory
curl -X DELETE /api/memory/user_123/me_8a4f2c

# Summarize old entries (older than 7 days)
curl -X POST /api/memory/user_123/summarize
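The same calls can be made from application code. Below is a small Go sketch against the endpoints documented above; the `http://localhost:8080` base address is an assumption for a local instance, not a documented default.

```go
package main

import (
	"bytes"
	"fmt"
	"net/http"
	"net/url"
)

// base is an assumed address for a self-hosted instance.
const base = "http://localhost:8080"

// relevantURL builds the relevance-search endpoint with a properly
// escaped query string.
func relevantURL(userID, query string, limit int) string {
	v := url.Values{}
	v.Set("query", query)
	v.Set("limit", fmt.Sprint(limit))
	return fmt.Sprintf("%s/api/memory/%s/relevant?%s", base, userID, v.Encode())
}

// storeMemory posts a new memory, mirroring the first curl example.
func storeMemory(userID, content, expiresIn string) (*http.Response, error) {
	body := fmt.Sprintf(`{"content":%q,"expires_in":%q}`, content, expiresIn)
	return http.Post(base+"/api/memory/"+userID, "application/json",
		bytes.NewBufferString(body))
}

func main() {
	fmt.Println(relevantURL("user_123", "scripting language", 5))
}
```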

Give your LLM a memory.

Memory ships with every Stockyard instance. Self-hosted or Cloud.

Start free · Back to platform