Metis - Intelligent MCP Router & Web Based MCP Client
Metis is an advanced AI agent platform with an intelligent MCP (Model Context Protocol) router that dynamically manages 1,000+ MCP servers. The router uses an LRU (least recently used) cache to keep only the most relevant servers active at any time, preventing context overload while still providing access to a vast ecosystem of tools and services. We built this so you can add capabilities without the context bloat of having 40+ tools enabled at once. We also provide a modular, web-based MCP client that anyone can modify; our goal is to help others build useful applications with MCPs.
Architecture
┌─────────────────┐     ┌─────────────────┐     ┌─────────────────┐
│    Frontend     │     │     Backend     │     │   MCP Router    │
│    (Next.js)    │────▶│    (FastAPI)    │────▶│    (Node.js)    │
│   Port: 3000    │     │   Port: 8000    │     │   Port: 9999    │
└─────────────────┘     └─────────────────┘     └─────────────────┘
         │                       │                        │
         ▼                       ▼                        ▼
  User Interface          AI Agent Logic         Intelligent Router
  - Chat Interface        - OpenAI Agents        - 1000+ MCP Servers
  - Real-time UI          - Session Management   - LRU Cache System
  - Streaming                                    - Dynamic Loading
                                                 - Auto Server Selection
Demo

Quick Start
Choose Your Setup Method:
- Shell Scripts (recommended for development): Follow steps below
- Docker (recommended for production): Jump to Docker Setup section
Step 1: Configure Your MCP Servers (Most Important!)
Add the MCP servers you want to use to server/mcp-registry.json. The registry can hold 1,000+ servers:
{
  "mcpServers": {
    "notion": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://mcp.notion.com/sse"]
    },
    "linear": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://mcp.linear.app/sse"]
    },
    "hyperbrowser": {
      "command": "npx",
      "args": ["-y", "hyperbrowser-mcp"],
      "env": {
        "HYPERBROWSER_API_KEY": "API_KEY_HERE"
      }
    }
  }
}
Step 2: Configure Router Cache Size
Edit server/src/add-new-mcp.ts and modify MAX_ACTIVE_SERVERS to set how many servers stay active simultaneously:
// Only this many servers will be active at once (LRU cache)
const MAX_ACTIVE_SERVERS = 3; // Adjust based on your needs
NOTE: All active MCP servers are listed in server/config.json; you can manually add or remove servers from the active queue there.
Step 3: Set Up Environment Variables
Single shared .env file (project root):
# Copy the template and configure
cp .env.example .env
# Edit .env with your configuration
OPENAI_API_KEY=your_openai_api_key_here
MAX_ACTIVE_SERVERS=3
NODE_ENV=development
Note: This project uses a single shared .env file in the project root instead of separate .env files in each component directory. This simplifies configuration management and ensures consistency across all services.
Step 4: Authentication & Indexing
# Build all services and authenticate into remote auth servers
./setup.sh
CRITICAL: All authentication happens during indexing. Any time you:
- Add new servers to mcp-registry.json
- Want to re-authenticate existing servers
- Change server configurations
You MUST rerun the setup script:
./setup.sh
This will:
- Authenticate with all configured servers
- Store credentials in ~/.mcp-auth
- Index and embed all servers for AI-powered selection
- Generate the router configuration
Step 5: Start the Application
# Start all services in background
./start.sh
# OR start in separate terminals (recommended for development)
./start.sh -s
Step 6: Use the AI Agent
Open http://localhost:3000 and start chatting!
- Automatic Server Selection: The AI automatically selects the best MCP server/tools for your query
- Manual Server Specification: You can also specify which server to use
- Dynamic Loading: The router loads/unloads servers as needed using LRU cache
- Real-time Tool Discovery: Available tools update dynamically
How the Intelligent Router Works
LRU Cache System
- Registry: 1,000+ servers in mcp-registry.json
- Active Cache: Only MAX_ACTIVE_SERVERS servers are loaded simultaneously
- Dynamic Loading: Servers are loaded and unloaded based on usage (least recently used evicted first)
- Current State: config.json shows the servers currently in the cache
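In code terms, the active-server cache behaves like a small LRU list. The following TypeScript is an illustrative sketch only (class and method names are invented here; the actual logic lives in server/src/add-new-mcp.ts):

```typescript
// Toy model of the router's LRU cache of active MCP servers.
// `use` marks a server as most recently used; when the cache is
// over capacity, the least recently used server is evicted.
class ActiveServerCache {
  private order: string[] = []; // least recently used first

  constructor(private maxActive: number) {}

  // Returns the name of an evicted server, or null if nothing was evicted.
  use(server: string): string | null {
    const idx = this.order.indexOf(server);
    if (idx !== -1) this.order.splice(idx, 1); // already active: refresh recency
    this.order.push(server);
    return this.order.length > this.maxActive ? this.order.shift()! : null;
  }

  active(): string[] {
    return [...this.order];
  }
}
```

With MAX_ACTIVE_SERVERS = 3, using notion, linear, and github fills the cache; a request that needs a fourth server (say, slack) evicts notion, the least recently used entry.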
AI-Powered Server Selection
- User asks a question
- AI analyzes the query using embeddings
- Router selects the best MCP server(s) and tools
- If server isn't in cache, it's loaded (LRU eviction if cache full)
- Tools are executed and results returned
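Under the hood, this kind of selection typically reduces to ranking precomputed server-description embeddings against the query embedding by cosine similarity. A hedged sketch with made-up types (the project's real selection code is in server/src/search-mcps.ts and may work differently):

```typescript
// Rank MCP servers by cosine similarity between a query embedding and
// each server's precomputed description embedding (hypothetical shape).
type IndexedServer = { name: string; embedding: number[] };

function cosine(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Return the topK best-matching server names; the router would then
// load the winner into the LRU cache before executing tools.
function selectServers(query: number[], servers: IndexedServer[], topK = 1): string[] {
  return servers
    .map(s => ({ name: s.name, score: cosine(query, s.embedding) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, topK)
    .map(s => s.name);
}
```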
Authentication Management
- Storage: Credentials stored in ~/.mcp-auth
- Timing: Authentication happens during indexing (./setup.sh)
- Re-auth: Run ./setup.sh to re-authenticate or add new servers
- Cleanup: Remove credentials with rm -rf ~/.mcp-auth
Components
1. Intelligent MCP Router (/server)
- Purpose: Manages 1000+ MCP servers with intelligent caching and AI-powered selection
- Tech Stack: Node.js, TypeScript, OpenAI API
- Key Features:
- LRU cache system with configurable MAX_ACTIVE_SERVERS
- AI-powered server selection using semantic embeddings
- Dynamic server loading/unloading
- Centralized authentication management
- Real-time server status tracking
2. AI Agent Backend (/client/backend)
- Purpose: AI agent orchestration and API endpoints
- Tech Stack: FastAPI, Python, OpenAI Agents SDK
- Key Features:
- Session management with streaming responses
- Integration with intelligent MCP router
- Tool execution and response handling
- Real-time communication via SSE
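For context, SSE streams deliver events as newline-delimited `data:` fields. A minimal parser like the one below is roughly what a client needs to consume such a stream; the exact event payloads Metis emits are an assumption here, not documented API:

```typescript
// Extract the payload of each `data:` field from a raw SSE chunk.
// Real clients also handle `event:`/`id:` fields and multi-line data;
// this sketch covers only the common single-line case.
function parseSseChunk(chunk: string): string[] {
  const payloads: string[] = [];
  for (const line of chunk.split("\n")) {
    if (line.startsWith("data: ")) {
      payloads.push(line.slice("data: ".length));
    }
  }
  return payloads;
}
```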
3. Interactive Frontend (/client/frontend)
- Purpose: User interface for the AI agent playground
- Tech Stack: Next.js, React, TypeScript
- Key Features:
- Real-time chat interface with tool visualization
- Server selection transparency
- Streaming response display
- Modern, responsive design
4. Web Based MCP Client
- Auto-refresh tools
- Modular: modify it however you see fit
Advanced Configuration
Router Cache Configuration
Edit server/src/add-new-mcp.ts:
const MAX_ACTIVE_SERVERS = 5; // Increase for more concurrent servers, at the cost of more context
Environment Variables
Shared configuration in root .env file (used by all services):
# Required
OPENAI_API_KEY=your_openai_api_key_here
# Optional (with defaults)
MAX_ACTIVE_SERVERS=3
NODE_ENV=development
SERVER_PORT=9999
BACKEND_PORT=8000
FRONTEND_PORT=3000
Docker-specific variables:
# Service URLs for container networking
SERVER_URL=http://server:9999
NEXT_PUBLIC_API_URL=http://localhost:8000
# Docker network and container names
DOCKER_NETWORK=metis-network
Prerequisites
- Node.js 18+
- Python 3.8+
- OpenAI API Key (required for server selection and embeddings)
Docker Setup (Alternative to Shell Scripts)
Docker provides a containerized environment that ensures consistent deployment across different systems. This section covers the full Docker setup as an alternative to the shell scripts.
Prerequisites
- Docker 20.10+ and Docker Compose 2.0+
- OpenAI API Key (required for server selection and embeddings)
Step 1: Environment Configuration
Create a .env file in the project root using the provided template:
# Copy the environment template
cp .env.example .env
# Edit the .env file with your configuration
# At minimum, set your OpenAI API key:
OPENAI_API_KEY=sk-your_openai_api_key_here
Key Environment Variables:
# Required Configuration
OPENAI_API_KEY=sk-your_openai_api_key_here # Required for AI-powered server selection
MAX_ACTIVE_SERVERS=3 # Number of MCP servers to keep active
NODE_ENV=development # or 'production'
# Service Ports (default values)
SERVER_PORT=9999 # MCP Router port
BACKEND_PORT=8000 # FastAPI backend port
FRONTEND_PORT=3000 # Next.js frontend port
# Service URLs (for Docker networking)
SERVER_URL=http://server:9999 # Backend -> Server connection
NEXT_PUBLIC_API_URL=http://localhost:8000 # Frontend -> Backend connection
Step 2: Configure MCP Servers
Add your desired MCP servers to server/mcp-registry.json:
{
  "mcpServers": {
    "notion": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://mcp.notion.com/sse"]
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_your_token"
      }
    }
  }
}
Step 3: Production Deployment
For production deployment with optimized builds:
# Build and start all services
docker-compose up -d
# View logs
docker-compose logs -f
# Stop services
docker-compose down
Production Features:
- Multi-stage optimized Docker builds
- Automatic service restart policies
- Health checks and monitoring
- Persistent data volumes
- Production-ready configurations
Step 4: Development Workflow
For development with hot reloading and debugging:
# Start development environment
docker-compose -f docker-compose.dev.yml up -d
# View logs for specific service
docker-compose -f docker-compose.dev.yml logs -f server
docker-compose -f docker-compose.dev.yml logs -f backend
docker-compose -f docker-compose.dev.yml logs -f frontend
# Stop development environment
docker-compose -f docker-compose.dev.yml down
Development Features:
- Source code hot reloading for all services
- Debug ports exposed (Node.js: 9229, Python: 5678)
- Development-optimized builds
- Enhanced logging and debugging tools
Step 5: Authentication & Server Indexing
After starting the containers, you need to authenticate and index your MCP servers:
# Option 1: Run authentication inside the server container
docker-compose exec server npm run setup-registry
# Option 2: Use the setup script (if available)
./setup.sh
This will:
- Authenticate with all configured MCP servers
- Store credentials in the ~/.mcp-auth volume
- Generate AI embeddings for intelligent server selection
- Create the active server configuration
Volume Management
Docker uses several volume types for data persistence:
Bind Mounts (Direct file access):
volumes:
  - ./server/mcp-registry.json:/app/mcp-registry.json  # Server registry
  - ./server/config.json:/app/config.json              # Active server cache
  - ./server/generated:/app/generated                  # AI embeddings
  - ~/.mcp-auth:/root/.mcp-auth                        # Authentication data
Named Volumes (Docker-managed):
volumes:
  - metis-server-node-modules:/app/node_modules        # Node.js dependencies
  - metis-frontend-node-modules:/app/node_modules      # Frontend dependencies
Service Architecture
The Docker setup creates a multi-container architecture:
┌─────────────────────────────────────────────────────────────────┐
│                           Docker Host                           │
│                                                                 │
│  ┌───────────────┐    ┌───────────────┐    ┌───────────────┐    │
│  │   Frontend    │    │    Backend    │    │  MCP Server   │    │
│  │   (Next.js)   │───▶│   (FastAPI)   │───▶│   (Node.js)   │    │
│  │  Port: 3000   │    │  Port: 8000   │    │  Port: 9999   │    │
│  │  nginx:alpine │    │  python:3.11  │    │node:18-alpine │    │
│  └───────────────┘    └───────────────┘    └───────────────┘    │
│          │                    │                    │            │
│  ┌─────────────────────────────────────────────────────────┐    │
│  │              Docker Network: metis-network              │    │
│  └─────────────────────────────────────────────────────────┘    │
└─────────────────────────────────────────────────────────────────┘
Health Checks & Monitoring
All services include health checks for monitoring:
# Check service health
docker-compose ps
# View health check logs
docker inspect metis-server --format='{{.State.Health.Status}}'
docker inspect metis-backend --format='{{.State.Health.Status}}'
docker inspect metis-frontend --format='{{.State.Health.Status}}'
Health Check Endpoints:
- Server: http://localhost:9999/health
- Backend: http://localhost:8000/health
- Frontend: http://localhost:3000/api/health
Docker Commands Reference
Basic Operations:
# Build images
docker-compose build
# Start services (detached)
docker-compose up -d
# View logs
docker-compose logs -f [service_name]
# Stop services
docker-compose down
# Remove volumes (caution: deletes data)
docker-compose down -v
Development Operations:
# Start development environment
docker-compose -f docker-compose.dev.yml up -d
# Rebuild specific service
docker-compose -f docker-compose.dev.yml build server
# Execute commands in running container
docker-compose exec server npm run build
docker-compose exec backend python -c "import app; print('Backend loaded')"
Maintenance Operations:
# View container resource usage
docker stats
# Clean up unused images and containers
docker system prune
# View volume usage
docker volume ls
docker volume inspect metis-server-node-modules
Troubleshooting Docker Issues
Common Issues and Solutions:
Port Conflicts:
# Check what's using the ports
lsof -i :3000 -i :8000 -i :9999
# Change ports in the .env file
FRONTEND_PORT=3001
BACKEND_PORT=8001
SERVER_PORT=9998

Permission Issues with Volumes:
# Fix ownership of bind-mounted directories
sudo chown -R $USER:$USER ./server/generated
sudo chown -R $USER:$USER ~/.mcp-auth

Container Won't Start:
# Check container logs
docker-compose logs server
# Check if environment variables are loaded
docker-compose exec server env | grep OPENAI_API_KEY

Authentication Failures:
# Clear authentication data and re-authenticate
docker-compose down
rm -rf ~/.mcp-auth
docker-compose up -d
docker-compose exec server npm run setup-registry

Build Failures:
# Clean build cache and rebuild
docker-compose build --no-cache
# Remove all containers and volumes
docker-compose down -v
docker system prune -a

Service Communication Issues:
# Test inter-service connectivity
docker-compose exec frontend curl http://backend:8000/health
docker-compose exec backend curl http://server:9999/health
# Check network configuration
docker network inspect metis-network

Hot Reloading Not Working (Development):
# Ensure proper volume mounting
docker-compose -f docker-compose.dev.yml down
docker-compose -f docker-compose.dev.yml up -d
# Check if source code is properly mounted
docker-compose exec server ls -la /app

Memory/Performance Issues:
# Monitor resource usage
docker stats
# Reduce MAX_ACTIVE_SERVERS in .env
MAX_ACTIVE_SERVERS=2
# Restart with new configuration
docker-compose restart
Log Analysis:
# View all service logs
docker-compose logs
# Follow logs for specific service
docker-compose logs -f server
# View last 100 lines
docker-compose logs --tail=100 backend
# Filter logs by timestamp
docker-compose logs --since="2024-01-01T00:00:00Z"
Environment Debugging:
# Check environment variables in container
docker-compose exec server printenv | grep -E "(OPENAI|NODE_ENV|SERVER_PORT)"
docker-compose exec backend printenv | grep -E "(OPENAI|PYTHON_ENV|BACKEND_PORT)"
# Verify .env file is loaded
docker-compose config
Development
Manual Development Setup
If you prefer to run services individually without Docker:
# Terminal 1: MCP Server
cd server
npm run dev:http
# Terminal 2: Backend
cd client/backend
uvicorn app:app --host localhost --port 8000 --log-level debug
# Terminal 3: Frontend
cd client/frontend
npm start
Key Development Commands
Server:
cd server
npm run build # Build TypeScript
npm run setup-registry # Index servers + generate AI summaries (combined)
npm run dev:http # Start development server
# Individual commands (if needed):
npm run index-servers # Index MCP servers only
npm run generate-ai-summaries # Generate AI summaries only
Backend:
cd client/backend
uvicorn app:app --reload # Start with hot reload
Frontend:
cd client/frontend
npm run build # Production build
npm start # Start production server
Project Structure
metis-router/
├── server/                    # Intelligent MCP Router
│   ├── src/
│   │   ├── add-new-mcp.ts     # Cache config (MAX_ACTIVE_SERVERS)
│   │   ├── mcp-proxy.ts       # Router proxy server
│   │   ├── search-mcps.ts     # AI-powered server selection
│   │   └── setup-registry.ts  # Combined indexing & embedding
│   ├── mcp-registry.json      # ALL servers (1000+)
│   ├── config.json            # Currently active servers (cache)
│   ├── Dockerfile             # Server container configuration
│   ├── .dockerignore          # Docker build exclusions
│   └── generated/             # AI embeddings & summaries
│
├── client/
│   ├── backend/               # AI Agent Backend
│   │   ├── app.py             # Main FastAPI application
│   │   ├── requirements.txt   # Python dependencies
│   │   ├── Dockerfile         # Backend container configuration
│   │   └── .dockerignore      # Docker build exclusions
│   │
│   └── frontend/              # Chat Interface
│       ├── app/               # Next.js application
│       ├── components/        # UI components
│       ├── Dockerfile         # Frontend container configuration
│       └── .dockerignore      # Docker build exclusions
│
├── ~/.mcp-auth/               # Authentication credentials (volume)
├── .env                       # Shared environment configuration
├── .env.example               # Environment template
├── docker-compose.yml         # Production container orchestration
├── docker-compose.dev.yml     # Development container orchestration
├── setup.sh                   # Setup + Auth + Indexing
├── start.sh                   # Start all services
└── README.md                  # This file
Key Files:
- MAX_ACTIVE_SERVERS: server/src/add-new-mcp.ts or .env
- Server Registry: server/mcp-registry.json
- Active Cache: server/config.json
- Credentials: ~/.mcp-auth/
- Docker Config: docker-compose.yml, docker-compose.dev.yml
- Environment: .env (shared by all services)
Key Features
Intelligent Router
- 1000+ Server Support: Manage massive MCP server ecosystems
- LRU Caching: Smart memory management with configurable cache size
- AI-Powered Selection: Automatic server/tool selection using embeddings
- Dynamic Loading: Servers load/unload based on demand
- Authentication Management: Centralized credential storage and handling
Advanced AI Agent
- Automatic Tool Discovery: AI selects optimal servers and tools for any query
- Real-time Streaming: Live response generation with tool call visualization
- Session Management: Persistent conversations with context preservation
- Flexible Interaction: Use any server or let AI choose automatically
Performance & Reliability
- Context Optimization: Never overwhelm the AI with too many servers
- Graceful Fallbacks: Robust error handling and recovery
- Hot Reloading: Add servers without system restart
- Monitoring: Real-time server status and performance tracking
Managing Your MCP Ecosystem
Adding New Servers
- Add to Registry: Edit server/mcp-registry.json
- Reindex & Authenticate: Run ./setup.sh (CRITICAL for auth!)
- Automatic Integration: The router discovers and integrates new servers
Re-authentication
# If you need to re-authenticate or add new credentials
./setup.sh
# To completely reset authentication
rm -rf ~/.mcp-auth
./setup.sh
Monitoring Active Servers
Check server/config.json to see which servers are currently loaded in the cache.
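To inspect the cache programmatically rather than by eye, a short helper can list the active entries. This sketch assumes config.json uses the same top-level "mcpServers" object as the registry; adjust the key if your file differs:

```typescript
import { readFileSync } from "node:fs";

// Names of servers in a parsed config object (assumed "mcpServers" shape).
function serverNames(config: { mcpServers?: Record<string, unknown> }): string[] {
  return Object.keys(config.mcpServers ?? {});
}

// Read server/config.json and list the currently cached servers.
function activeServers(configPath: string): string[] {
  return serverNames(JSON.parse(readFileSync(configPath, "utf8")));
}
```

For example, `activeServers("server/config.json")` returns the names currently loaded in the LRU cache.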
Router Configuration Examples
Basic Server Configuration
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_your_token"
      }
    },
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/you/Documents"],
      "env": {}
    }
  }
}
Advanced Server Configuration
{
  "mcpServers": {
    "slack": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-slack"],
      "env": {
        "SLACK_BOT_TOKEN": "xoxb-your-token"
      }
    },
    "postgres": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-postgres", "postgresql://user:pass@localhost:5432/db"],
      "env": {}
    },
    "custom-server": {
      "command": "python",
      "args": ["/path/to/your/custom/server.py"],
      "env": {
        "CUSTOM_API_KEY": "your_key",
        "CUSTOM_CONFIG": "value"
      }
    }
  }
}
Troubleshooting
Shell Script Setup Issues
Authentication Issues:
# If servers fail to authenticate
rm -rf ~/.mcp-auth
./setup.sh
# Check what credentials are stored
ls -la ~/.mcp-auth/
Router Cache Issues:
# Check currently active servers
cat server/config.json
# Restart router to clear cache
cd server && npm run dev:http
Common Issues:
- Authentication Failures: Run ./setup.sh after adding new servers
- Server Selection Problems: Ensure the OpenAI API key is set in the .env file
- Cache Overflow: Reduce MAX_ACTIVE_SERVERS in .env or server/src/add-new-mcp.ts
- Port Conflicts: Ensure ports 3000, 8000, and 9999 are available
Docker Setup Issues
Container Issues:
# Check container status
docker-compose ps
# View container logs
docker-compose logs -f [service_name]
# Restart specific service
docker-compose restart [service_name]
Environment Issues:
# Verify environment variables are loaded
docker-compose exec server printenv | grep OPENAI_API_KEY
# Check shared .env file configuration
docker-compose config
Volume and Data Issues:
# Check volume mounts
docker-compose exec server ls -la /app
docker inspect metis-server | grep -A 10 "Mounts"
# Fix permission issues
sudo chown -R $USER:$USER ./server/generated
sudo chown -R $USER:$USER ~/.mcp-auth
Network Issues:
# Test inter-service connectivity
docker-compose exec frontend curl http://backend:8000/health
docker-compose exec backend curl http://server:9999/health
# Check Docker network
docker network inspect metis-network
General Debugging
Logs & Monitoring:
- Shell Scripts: Check individual terminal outputs
- Docker: Use docker-compose logs -f for real-time logs
- Authentication: Watch for auth failures during setup
- AI Selection: Backend logs show which servers/tools are selected
- Cache Status: Monitor config.json for active server changes
Performance Issues:
- Memory Usage: Monitor with docker stats (Docker) or a system monitor
- Cache Size: Reduce MAX_ACTIVE_SERVERS in the .env file
- Build Performance: Use docker-compose build --parallel for faster builds
License
This project is licensed under the MIT License.
Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Test thoroughly
- Submit a pull request
For more information or support, please refer to the individual component README files or open an issue.
