Context Transporter MCP
A Model Context Protocol (MCP) server built with FastMCP for seamless context transfer between conversation threads.
Features
- Extract Session Context: Pull and summarize conversation history from existing sessions
- Seed New Sessions: Create fresh conversation threads with injected context from previous sessions
- Smart Context Filtering: Extract only relevant messages using semantic similarity and importance scoring
- LRU Caching: Automatic caching of frequently accessed sessions for better performance
- Relevance Scoring: Filter out irrelevant information with configurable weights
- Docker Support: Run as HTTP server in Docker container
- Async Throughout: Built with async/await for optimal performance
- Type-Safe: Full type hints for better IDE support and code quality
Quick Start
1. Installation
# Clone the repository
git clone https://github.com/yourusername/context-transporter-mcp.git
cd context-transporter-mcp
# Create virtual environment
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
# Basic installation
pip install -e .
# With semantic embeddings (recommended for better relevance scoring)
pip install -e ".[embeddings]"
2. Run the Server
python src/context_transporter.py
3. Configure Your MCP Client
Add to your MCP client configuration (e.g., Claude Desktop, Cursor, Augment):
{
"mcpServers": {
"context-transporter": {
"command": "python",
"args": ["/path/to/context-transporter-mcp/src/context_transporter.py"]
}
}
}
Config file locations:
- Claude Desktop: ~/.config/claude/claude_desktop_config.json (macOS/Linux) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
- Augment: %APPDATA%\Augment\mcp_config.json (Windows) or ~/Library/Application Support/Augment/mcp_config.json (macOS)
- Cursor: check Cursor's MCP settings
Available Tools
1. extract_session_context
Save context from the current conversation, or load context from a previously saved session.
Save current conversation:
result = await extract_session_context(
messages=[
{"role": "user", "content": "How do I use FastMCP?"},
{"role": "assistant", "content": "FastMCP is a Python framework..."}
],
title="FastMCP Discussion"
)
# Returns: session_id to use later
Load existing session:
result = await extract_session_context(session_id="session-abc123")
2. seed_context
Load context from a previous session into a new thread.
# In a fresh thread, load context from previous session
result = await seed_context(session_id="session-abc123")
# The AI now has full context from the previous conversation!
Context modes:
"full"(default): Complete conversation transcript"summary": Just the conversation summary"key_points": Just the key takeaways
3. extract_relevant_context
Extract only the most relevant messages based on a query.
result = await extract_relevant_context(
session_id="session-123",
query="authentication login",
max_messages=10,
recency_weight=0.3, # Weight for recent messages
importance_weight=0.2, # Weight for important content
similarity_weight=0.5 # Weight for query similarity
)
4. list_sessions
List all available saved sessions.
result = await list_sessions()
5. clear_cache
Clear the session cache to free memory.
result = await clear_cache()
Workflow Example
Thread A (Original Conversation):
User: "Save this conversation for later"
→ extract_session_context(messages=[...], title="My Discussion")
→ Returns: session_id = "session-abc123"
Thread B (Fresh Thread):
User: "Load context from session-abc123"
→ seed_context(session_id="session-abc123")
→ The AI now has full context from Thread A!
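Under the hood, each saved session lands as a JSON file under `sessions/`, keyed by its `session_id`. Here is a minimal sketch of that round trip; the exact file layout (`title` and `messages` keys) is an assumption, so check `src/context_transporter.py` for the real schema.

```python
import json
from pathlib import Path

SESSIONS_DIR = Path("sessions")

def save_session(session_id: str, title: str, messages: list[dict]) -> Path:
    """Persist a session as sessions/<session_id>.json (illustrative layout)."""
    SESSIONS_DIR.mkdir(exist_ok=True)
    path = SESSIONS_DIR / f"{session_id}.json"
    path.write_text(json.dumps({"title": title, "messages": messages}, indent=2))
    return path

def load_session(session_id: str) -> dict:
    """Read a saved session back from disk."""
    return json.loads((SESSIONS_DIR / f"{session_id}.json").read_text())

save_session("session-abc123", "My Discussion",
             [{"role": "user", "content": "Save this conversation for later"}])
restored = load_session("session-abc123")
print(restored["title"])  # -> My Discussion
```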
Relevance Scoring
The relevance scoring system uses three factors:
1. Recency Score
- Exponential decay: score = e^(-0.1 * position_from_end)
- Recent messages score higher
2. Importance Score
Detects important signals:
- Questions (contains ?): +0.2
- Code blocks (contains ```): +0.15
- Decision keywords (implement, fix, bug): +0.1
- Long messages (>200 chars): +0.05
3. Similarity Score
- With embeddings: Cosine similarity between message and query
- Without embeddings: Jaccard similarity (keyword overlap)
Weight Guidelines:
- High recency_weight (0.5-0.7): For ongoing conversations
- High importance_weight (0.5-0.7): For decision-heavy discussions
- High similarity_weight (0.5-0.7): For topic-specific extraction
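Putting the three factors together, the final score is a weighted sum. The following is a self-contained sketch of the scoring scheme described above, using the keyword-overlap (Jaccard) fallback; the real `relevance_scorer.py` may differ in detail.

```python
import math

def recency_score(position_from_end: int) -> float:
    """Exponential decay: the most recent message (position 0) scores 1.0."""
    return math.exp(-0.1 * position_from_end)

TRIPLE_BACKTICK = "`" * 3  # built at runtime so docs tooling doesn't swallow it

def importance_score(content: str) -> float:
    """Sum of the heuristic signals listed above."""
    score = 0.0
    if "?" in content:
        score += 0.2   # questions
    if TRIPLE_BACKTICK in content:
        score += 0.15  # code blocks
    if any(k in content.lower() for k in ("implement", "fix", "bug")):
        score += 0.1   # decision keywords
    if len(content) > 200:
        score += 0.05  # long messages
    return score

def jaccard_similarity(text: str, query: str) -> float:
    """Keyword-overlap fallback used when embeddings are unavailable."""
    a, b = set(text.lower().split()), set(query.lower().split())
    return len(a & b) / len(a | b) if a | b else 0.0

def relevance(content: str, position_from_end: int, query: str,
              recency_weight=0.3, importance_weight=0.2,
              similarity_weight=0.5) -> float:
    """Weighted combination of the three factors."""
    return (recency_weight * recency_score(position_from_end)
            + importance_weight * importance_score(content)
            + similarity_weight * jaccard_similarity(content, query))

# Most recent message, contains "?" and "fix"/"bug", shares "login" with the query
print(round(relevance("How do we fix the login bug?", 0, "authentication login"), 4))
# -> 0.4225
```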
Docker Setup (HTTP Transport)
Run as an HTTP server in Docker for better stability.
Build and Run
# Build
docker build -t context-transporter-mcp:latest .
# Run with Docker Compose
docker-compose up -d
# Or run directly (docker run needs an absolute host path for -v)
docker run -d --name context-transporter-mcp -p 8090:8090 \
  -v "$(pwd)/sessions:/app/sessions" context-transporter-mcp:latest
Configure MCP Client for HTTP
{
"mcpServers": {
"context-transporter": {
"url": "http://localhost:8090/sse",
"type": "http"
}
}
}
Advantages of HTTP Transport
- No Windows async issues (runs in a Linux container)
- Better debugging (test with curl)
- Persistent server (faster responses)
- Multiple clients can connect
Architecture
┌──────────────────────────────────────────────────────────────┐
│                  MCP Client (AI Assistant)                   │
└───────────────────────────┬──────────────────────────────────┘
                            │
                            ▼
┌──────────────────────────────────────────────────────────────┐
│               Context Transporter MCP Server                 │
│                                                              │
│  ┌────────────────────────────────────────────────────────┐  │
│  │  Tool: extract_relevant_context                        │  │
│  │  - Query-based filtering                               │  │
│  │  - Relevance scoring                                   │  │
│  │  - Smart message selection                             │  │
│  └────────────────────────────────────────────────────────┘  │
│                            │                                 │
│                            ▼                                 │
│   ┌──────────────────┐        ┌──────────────────┐           │
│   │  Context Cache   │        │ Relevance Scorer │           │
│   │  - LRU eviction  │        │  - Recency       │           │
│   │  - Fast access   │        │  - Importance    │           │
│   │  - Statistics    │        │  - Similarity    │           │
│   └──────────────────┘        └──────────────────┘           │
└───────────────────────────┬──────────────────────────────────┘
                            │
                            ▼
                 ┌─────────────────────┐
                 │   Session Storage   │
                 │   sessions/*.json   │
                 └─────────────────────┘
File Structure
context-transporter-mcp/
βββ src/
β βββ context_transporter.py # Main MCP server
β βββ context_cache.py # LRU caching
β βββ relevance_scorer.py # Relevance scoring
βββ sessions/ # Session storage (gitignored)
βββ tests/ # Test files
βββ examples/ # Usage examples
βββ Dockerfile # Docker support
βββ docker-compose.yml # Docker Compose config
βββ pyproject.toml # Python package config
Development
Running Tests
# Install dev dependencies
pip install -e ".[dev]"
# Run tests
pytest
# Run standalone test
python test_caching_standalone.py
Code Formatting
black src/
ruff check src/
Use Cases
- Continue conversations across different chat interfaces
- Transfer context between different AI assistants
- Archive and resume long-running discussions
- Share conversation context with team members
- Migrate conversations between platforms
- Extract relevant information without context bloat
- Filter conversations by topic or importance
Troubleshooting
Server won't start
- Make sure FastMCP is installed: pip install fastmcp
- Check Python version: requires Python 3.10+
Sessions not persisting
- Check that the sessions/ directory exists and is writable
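A quick shell check for that, run from the repository root (creates the directory if it is missing):

```shell
# Create the directory if missing, then probe write access
mkdir -p sessions
[ -w sessions ] && echo "sessions/ is writable" || echo "sessions/ is NOT writable"
```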
Embeddings not working
- Install the optional embedding dependencies: pip install sentence-transformers numpy
Cache issues
# Check cache stats
await get_cache_stats()
# Clear cache
await clear_cache()
Built With
- FastMCP - The fast, Pythonic way to build MCP servers
- Python 3.10+ with async/await
- Optional: sentence-transformers for semantic embeddings
License
MIT
