MonkeyRun Open Brain
Open Brain: a personal knowledge/memory MCP server for AI agents
Open Brain 🧠
Built on Nate B. Jones's Open Brain architecture. This is MonkeyRun's implementation and extension of his open-source personal memory system: we've added Gmail capture, RAG chunking for long-form content, and documented the journey of building it with AI.
MonkeyRun is a software studio that builds and ships products using AI as the primary development partner, using tools like Cursor, Claude, and OpenClaw to manage real projects end-to-end. This repo is one of those experiments: building a persistent AI memory layer and learning in public.
Open Brain is an MCP server that lets AI agents save, search, and retrieve "thoughts" using semantic vector search. You capture ideas, decisions, emails, and notes, and any AI client you use (Claude, ChatGPT, OpenClaw) can search them by meaning, not just keyword.
Data Sources
| Source | How | Status |
|---|---|---|
| Discord #capture channel | Push → type into channel | ✅ Built-in |
| Gmail (sent + starred) | Pull → run scripts/pull-gmail.ts | ✅ Available |
| ChatGPT export | Bulk import → run scripts/import-chatgpt.py | ✅ Available |
| Manual (any AI client) | capture_thought MCP tool | ✅ Built-in |
ChatGPT Export Importer
Turn your ChatGPT data export into searchable memories. Processes your entire conversation history, filters out low-signal content (quick lookups, one-off creative tasks), and distills lasting knowledge into Open Brain.
# Export your data: chatgpt.com → Settings → Data Controls → Export Data
# Then run:
pip install requests
python scripts/import-chatgpt.py ~/Downloads/chatgpt-export.zip --dry-run --limit 10
python scripts/import-chatgpt.py ~/Downloads/chatgpt-export.zip
What it captures: decisions and reasoning, people with context, project plans, lessons learned, business context, personal values
What it skips: poems, jokes, one-off writing tasks, generic Q&A, coding help with no lasting context
Each ingested thought stores:
- A distilled summary (used for semantic search)
- The original conversation text (retrievable on demand)
- A direct link back to the source (https://chatgpt.com/c/<id>)
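As a sketch, an ingested thought might look like the record below. The field names here are illustrative assumptions for the three stored pieces described above, not the actual database schema:

```typescript
// Illustrative shape of an ingested thought. Field names are assumptions
// mirroring the three stored pieces: summary, original text, source link.
interface IngestedThought {
  summary: string;   // distilled summary, embedded for semantic search
  fullText: string;  // original conversation text, retrievable on demand
  sourceUrl: string; // direct link back to the ChatGPT conversation
}

const example: IngestedThought = {
  summary: "Chose pgvector inside Supabase over a separate hosted vector DB.",
  fullText: "(original conversation transcript goes here)",
  sourceUrl: "https://chatgpt.com/c/abc123",
};

console.log(example.sourceUrl);
```

Only the summary is embedded; the full text stays out of the search index and is fetched on demand, which keeps embeddings focused on the distilled signal.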
Supports OpenRouter (default, ~$0.15/100 conversations) or local Ollama for zero-cost inference. See scripts/import-chatgpt.py --help for all options.
Guides & Extensions
| Guide | Description |
|---|---|
| Email Capture - Visual Overview | Add Gmail to your persistent memory: pull-based ingestion, RAG chunking, the five things that broke |
| Email Capture - Full Guide | Step-by-step setup, troubleshooting, automation, and security notes |
| Substack Post Draft | Community contribution post for Nate B. Jones's Open Brain series |
Architecture
- Database: Supabase PostgreSQL + pgvector (1536-dimension embeddings, full_text column for source storage)
- Embeddings: OpenRouter → openai/text-embedding-3-small
- Metadata extraction: OpenRouter → openai/gpt-4o-mini
- MCP Server: Deno Edge Function (Supabase) using @modelcontextprotocol/sdk
- Capture: Discord #capture channel → ingest-thought Edge Function
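Semantic search ranks thoughts by how close their 1536-dimension embeddings sit to the query embedding. pgvector computes this in-database; a minimal sketch of the cosine-similarity scoring it performs:

```typescript
// Cosine similarity between two embedding vectors: the score pgvector
// uses to rank thoughts by semantic closeness (computed server-side).
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) throw new Error("dimension mismatch");
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

console.log(cosineSimilarity([1, 0], [1, 0])); // 1 (identical direction)
console.log(cosineSimilarity([1, 0], [0, 1])); // 0 (orthogonal)
```

In production the vectors come from openai/text-embedding-3-small and the comparison runs inside PostgreSQL via a pgvector index, not in application code.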
MCP Tools
| Tool | Description |
|---|---|
| capture_thought | Save a thought with auto-embedding + LLM-extracted metadata |
| search_thoughts | Semantic search; returns source URL and optional full original text |
| list_thoughts | Browse recent thoughts with filters (type, topic, person, days) |
| thought_stats | Totals, type distribution, top topics, top people |
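Clients invoke these through the standard MCP `tools/call` JSON-RPC request. A hedged sketch of what a capture_thought call could look like on the wire (the argument name `thought` is an assumption; consult the server's tool schema for the real parameters):

```typescript
// Sketch of an MCP tools/call request for capture_thought.
// The "thought" argument name is an assumption, not the documented schema.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "capture_thought",
    arguments: {
      thought: "Starred Gmail messages now flow into Open Brain via the pull script.",
    },
  },
};

console.log(JSON.stringify(request));
```

The server responds with a JSON-RPC result; the @modelcontextprotocol/sdk handles this envelope for you when you use an MCP client library rather than raw HTTP.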
Deployment
supabase functions deploy open-brain-mcp --no-verify-jwt
supabase functions deploy ingest-thought --no-verify-jwt
Infrastructure
- Supabase Project: Your own Supabase project (see supabase/config.toml after supabase link)
- Auth: Custom key-based (x-brain-key header or ?key= query param)
- JWT: Disabled at function level (self-managed auth)
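A minimal sketch of the key extraction the functions would perform under this scheme: read the x-brain-key header first, then fall back to the ?key= query parameter. The helper name is hypothetical; the real Edge Function code may differ:

```typescript
// Hypothetical helper: pull the caller's key from the x-brain-key header
// or the ?key= query parameter, matching the auth scheme described above.
function extractBrainKey(headers: Record<string, string>, url: string): string | null {
  const headerKey = headers["x-brain-key"];
  if (headerKey) return headerKey;
  return new URL(url).searchParams.get("key");
}

console.log(extractBrainKey({ "x-brain-key": "s3cret" }, "https://example.com/mcp")); // "s3cret"
console.log(extractBrainKey({}, "https://example.com/mcp?key=s3cret"));               // "s3cret"
```

The extracted key would then be compared against a secret stored with `supabase secrets set`, which is why the built-in JWT verification can stay disabled.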
