## Your AI Has Amnesia
Here's a workflow every developer knows:
**Monday:** "Claude, here's how our auth system works. We use JWT with refresh tokens, httpOnly cookies, and the middleware lives in `src/auth/`. We decided last week to migrate from sessions to tokens because..."

**Tuesday:** "Hey Claude, remember the auth system? ...No? Okay, let me explain again."
Every AI conversation starts from zero. You're working with the most capable coding assistant ever built, and it can't remember what you told it yesterday.
This isn't a model problem — GPT-4, Claude, Gemini all handle context brilliantly within a session. The problem is between sessions. There's nowhere for that context to persist.
## The Model Context Protocol (MCP)
In late 2024, Anthropic released the Model Context Protocol — an open standard that lets AI clients discover and use external tools. Think of it as a USB port for AI capabilities.
MCP is what makes persistent memory possible. Instead of hoping your AI client will eventually add memory features, you can build a memory server that any MCP-compatible client can use.
That's exactly what I built.
## Introducing Wyrm
Wyrm is an MCP server that gives AI agents persistent, searchable memory. It stores everything in a local SQLite database — projects, sessions, tasks, arbitrary data — and exposes 31 tools for reading and writing it.
### How It Works
```
┌─────────────────────────────┐
│          AI Client          │
│  (Claude, Copilot, Cursor)  │
└──────────┬──────────────────┘
           │ MCP (stdio)
           ▼
┌─────────────────────────────┐
│       Wyrm MCP Server       │
│   31 tools for memory ops   │
└──────────┬──────────────────┘
           │
           ▼
┌─────────────────────────────┐
│  SQLite (WAL mode + FTS5)   │
│       ~/.wyrm/wyrm.db       │
└─────────────────────────────┘
```
Your AI client connects to Wyrm via stdio. Wyrm registers its tools. When your AI needs to remember something — or look something up — it calls the appropriate Wyrm tool. All data stays local.
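Under the hood, each tool call is a JSON-RPC 2.0 message over stdio. A `tools/call` request for Wyrm's quest tool might look like this (the argument names here are illustrative, not Wyrm's actual schema):

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "tools/call",
  "params": {
    "name": "wyrm_quest_add",
    "arguments": {
      "title": "migrate Stripe webhooks to v2 API",
      "priority": "high"
    }
  }
}
```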
### Setup in 60 Seconds
Install:
```bash
git clone https://github.com/ghosts-lk/Wyrm.git
cd Wyrm/packages/mcp-server
npm install && npm run build
npm link
```
Connect to Claude Desktop (`~/.config/claude/claude_desktop_config.json`):

```json
{
  "mcpServers": {
    "wyrm": {
      "command": "wyrm-mcp"
    }
  }
}
```
Connect to GitHub Copilot (`.vscode/settings.json`):

```json
{
  "mcp": {
    "servers": {
      "wyrm": {
        "command": "wyrm-mcp"
      }
    }
  }
}
```
That's it. Restart your AI client, and it now has memory.
## What Wyrm Remembers

### Projects

Wyrm auto-discovers your git repositories and tracks them:

```
You: "Scan ~/projects for repos"
AI:  → calls wyrm_scan_projects
     "Found 12 repositories. Registered all projects."
```
### Sessions

Every working session gets stored with context, decisions, and outcomes:

```
You: "Start a session for the payment API refactor"
AI:  → calls wyrm_session_start
     "Started session 'payment-api-refactor'. I have context from
      3 previous sessions on this project."
```
### Quests (Tasks)

Task tracking built right into your AI workflow:

```
You: "Add a quest: migrate Stripe webhooks to v2 API, high priority"
AI:  → calls wyrm_quest_add
     "Quest added. You now have 4 pending quests on this project."
```
### Data Lake

Store any structured data your AI might need later:

```
You: "Store our API rate limits: free tier 100/hr, pro 1000/hr, enterprise unlimited"
AI:  → calls wyrm_data_insert (namespace: "api-config", key: "rate-limits")
     "Stored. I'll reference this when working on the API."
```
### Full-Text Search

Everything gets indexed. Search across all projects, all sessions, all data:

```
You: "Search for anything about database migrations"
AI:  → calls wyrm_search
     "Found 7 results across 3 projects: migration decisions from
      Feb 12, schema changes from the auth refactor, and the
      PostgreSQL → SQLite discussion..."
```
## Technical Decisions Worth Noting

### Why SQLite?

No setup. No services. `npm install` and you have a database. SQLite in WAL mode handles concurrent access (multiple AI clients, one database), and FTS5 gives us full-text search without Elasticsearch.
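What WAL mode buys you is easy to see in a few lines. This is an illustrative sketch (in Python's stdlib `sqlite3`, not Wyrm's actual TypeScript code; the table and path are invented): two independent connections share one database file without readers blocking the writer.

```python
import os
import sqlite3
import tempfile

# Hypothetical path for the sketch; Wyrm's real database lives at ~/.wyrm/wyrm.db
path = os.path.join(tempfile.mkdtemp(), "wyrm.db")

# One connection per AI client, all pointed at the same file
writer = sqlite3.connect(path)
writer.execute("PRAGMA journal_mode=WAL")  # readers no longer block the writer
writer.execute("CREATE TABLE quests (id INTEGER PRIMARY KEY, title TEXT)")
writer.execute("INSERT INTO quests (title) VALUES ('migrate Stripe webhooks')")
writer.commit()

# A second client can open its own connection and read concurrently,
# without hitting "database is locked"
reader = sqlite3.connect(path)
row = reader.execute("SELECT title FROM quests").fetchone()
print(row)  # ('migrate Stripe webhooks',)
```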
### Why FTS5 Triggers?

Every entity type (project, session, quest, data point) needs to be searchable. Instead of manual index management, database triggers automatically update the FTS5 index on every insert and update. One search query covers everything.
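The trigger pattern looks roughly like this. A minimal sketch using SQLite's external-content FTS5 tables (again Python stdlib for runnability; the table and column names are invented, not Wyrm's schema):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE sessions (id INTEGER PRIMARY KEY, title TEXT, notes TEXT);

-- External-content FTS5 index over the sessions table
CREATE VIRTUAL TABLE search_index
  USING fts5(title, notes, content='sessions', content_rowid='id');

-- Triggers keep the index in sync on every write; no manual reindexing
CREATE TRIGGER sessions_ai AFTER INSERT ON sessions BEGIN
  INSERT INTO search_index(rowid, title, notes)
  VALUES (new.id, new.title, new.notes);
END;
CREATE TRIGGER sessions_au AFTER UPDATE ON sessions BEGIN
  INSERT INTO search_index(search_index, rowid, title, notes)
  VALUES ('delete', old.id, old.title, old.notes);
  INSERT INTO search_index(rowid, title, notes)
  VALUES (new.id, new.title, new.notes);
END;
""")

db.execute("INSERT INTO sessions (title, notes) VALUES (?, ?)",
           ("auth refactor", "migrated from sessions to JWT refresh tokens"))
db.commit()

# One full-text query covers everything the triggers have indexed
hits = db.execute(
    "SELECT s.title FROM search_index "
    "JOIN sessions s ON s.id = search_index.rowid "
    "WHERE search_index MATCH ?", ("JWT",)).fetchall()
print(hits)  # [('auth refactor',)]
```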
### Why Markdown Sync?

Wyrm maintains a `.wyrm/` directory in each tracked project with markdown files — `hoard.md` (knowledge), `chronicles.md` (history), `quests.md` (tasks), `protocol.md` (AI guidelines). These sync bi-directionally with the database.
Why? Because markdown files are git-trackable. Your AI's memory about a project lives in that project's repository. Team members can read and contribute to the AI's context through regular git workflows.
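Concretely, a tracked repository ends up with a layout like this (file names from above; the surrounding tree is a sketch):

```
my-project/
├── src/
└── .wyrm/
    ├── hoard.md       # knowledge
    ├── chronicles.md  # history
    ├── quests.md      # tasks
    └── protocol.md    # AI guidelines
```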
### Why Optional Encryption?

Not everyone needs it, but when you're storing architecture decisions, API patterns, or internal tooling details, AES-256-GCM encryption with a local key keeps things locked down. Keys never leave your machine.
## What I Learned Building This

**MCP is early but powerful.** The protocol is still evolving, but the core idea — giving AI clients a standardized way to use external tools — is exactly right. Building an MCP server feels like building a REST API, but for AI.

**AI agents are better at using tools than I expected.** I was worried about tool selection — would the AI call the right Wyrm tool at the right time? Turns out, with clear tool descriptions, modern models are remarkably good at orchestrating tool calls.

**Memory changes the relationship with AI.** This is the biggest takeaway. When your AI remembers your decisions, your architecture, your preferences — the conversations shift from "let me explain everything" to "let's build on what we know." It feels collaborative instead of transactional.
## Try It

Wyrm is AGPL-3.0 open source. The entire codebase is TypeScript with only 2 runtime dependencies (`better-sqlite3` and `@modelcontextprotocol/sdk`). 79 tests passing, zero known vulnerabilities.

- GitHub: [github.com/ghosts-lk/Wyrm](https://github.com/ghosts-lk/Wyrm)
- Docs: [ghosts.lk/wyrm/docs](https://ghosts.lk/wyrm/docs)
- Landing page: [ghosts.lk/wyrm](https://ghosts.lk/wyrm)
If you find it useful, a ⭐ on GitHub helps other developers discover it.
Built by [Ghost Protocol](https://ghosts.lk) in Colombo, Sri Lanka.
The dragon remembers. 🐉