# mcp-external-memory
An MCP server that gives LLMs persistent, searchable semantic memory.
## Install

```bash
pip install mcp-external-memory
```
## Usage

```python
from mcp_external_memory import memory_store, memory_search

# Store a memory
result = memory_store(content="Alice prefers dark mode", namespace="users", tags=["alice", "ui"])

# Search memories
results = memory_search(query="what does Alice prefer?", namespace="users")
```
## CLI

```bash
mcp-external-memory --help
```
## API

### Tools

| Tool | Description |
|---|---|
| `memory_store` | Persist text with optional namespace, tags, and metadata |
| `memory_search` | Semantic search (cosine similarity) over all memories |
| `memory_get` | Retrieve a single memory by ID |
| `memory_delete` | Delete a memory by ID |
| `memory_list` | List memories with optional namespace/tag filters and pagination |
| `memory_stats` | Report memory count, namespaces, and database path |
| `memory_update` | Update an existing memory |
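To make the tool surface above concrete, here is a minimal in-memory analogue of the store/get/delete/list operations. All names, signatures, and return shapes below are illustrative assumptions for the sketch, not the server's actual API:

```python
import uuid

class ToyMemoryStore:
    """Illustrative in-memory analogue of the tools table (not the real server)."""

    def __init__(self):
        self._memories = {}

    def store(self, content, namespace="default", tags=None):
        # Like memory_store: persist content, return a generated ID
        memory_id = str(uuid.uuid4())
        self._memories[memory_id] = {
            "id": memory_id,
            "content": content,
            "namespace": namespace,
            "tags": tags or [],
        }
        return memory_id

    def get(self, memory_id):
        # Like memory_get: retrieve a single memory by ID
        return self._memories.get(memory_id)

    def delete(self, memory_id):
        # Like memory_delete: remove by ID, report success
        return self._memories.pop(memory_id, None) is not None

    def list(self, namespace=None, tag=None, limit=10, offset=0):
        # Like memory_list: optional namespace/tag filters plus pagination
        items = [
            m for m in self._memories.values()
            if (namespace is None or m["namespace"] == namespace)
            and (tag is None or tag in m["tags"])
        ]
        return items[offset:offset + limit]

store = ToyMemoryStore()
mid = store.store("Alice prefers dark mode", namespace="users", tags=["alice", "ui"])
print(store.get(mid)["content"])                       # Alice prefers dark mode
print(len(store.list(namespace="users", tag="alice")))  # 1
```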
## Embedding Backends

The server supports multiple embedding backends:

- **TF-IDF** (default): pure Python, no external dependencies
- **OpenAI**: uses the `text-embedding-3-small` model
- **Ollama**: local embeddings via Ollama

Select a backend with the `MEMORY_EMBED_BACKEND` environment variable.
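With the default TF-IDF backend, "semantic" search amounts to cosine similarity between term-weight vectors. A minimal sketch of that idea under simple assumptions (whitespace tokenization, smoothed IDF); this is not the server's actual implementation:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Compute simple smoothed TF-IDF vectors for a list of token lists."""
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))  # document frequency per term
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        vectors.append({
            term: (count / len(doc)) * math.log((1 + n) / (1 + df[term]))
            for term, count in tf.items()
        })
    return vectors

def cosine(a, b):
    """Cosine similarity between two sparse term-weight vectors."""
    dot = sum(a[t] * b.get(t, 0.0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

docs = [
    "alice prefers dark mode".split(),
    "deploy the service with docker".split(),
]
query = "what does alice prefer".split()

# Vectorize the corpus and the query together, then rank by similarity
vecs = tfidf_vectors(docs + [query])
scores = [cosine(vecs[-1], v) for v in vecs[:-1]]
print(scores.index(max(scores)))  # 0 -> the Alice memory is the best match
```

Note that plain TF-IDF only matches exact tokens ("prefer" does not match "prefers" here; "alice" does), which is why the OpenAI and Ollama backends exist for genuinely semantic matching.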
## Development

```bash
git clone https://github.com/daedalus/mcp-external-memory.git
cd mcp-external-memory
pip install -e ".[test]"

# run tests
pytest

# format
ruff format src/ tests/

# lint
ruff check src/ tests/

# type check
mypy src/
```
## MCP Registry

mcp-name: io.github.daedalus/mcp-external-memory
