Endara Relay
One endpoint for all your MCP servers. endara.ai
Aggregate local and cloud MCP servers behind a single endpoint. Add servers, manage OAuth, and connect any AI client, all from one place.
Works with Claude Desktop, ChatGPT, Cursor, Windsurf, VS Code, Zed, Continue, and any MCP-compatible client.
Why?
- One endpoint, not N: point every AI client at `localhost:9400` instead of pasting the same MCP server config into each app.
- OAuth managed for you: Relay handles token storage and refresh for servers that need it, so your clients don't have to.
- Hot-reload config: edit your TOML, save, and Relay picks up the change without a restart.
- Automatic restart on crash: flaky STDIO servers come back on their own with exponential backoff.
- Fully local: no cloud, no accounts, no telemetry. Everything runs on your machine.
What is this?
Endara Relay is a single Rust binary that sits between your AI assistant (Claude Desktop, Cursor, or any MCP client) and all the MCP servers you use. Instead of configuring each server individually in your client, you point your client at one local endpoint, `localhost:9400`, and Relay handles the rest.
It connects to each MCP server using the appropriate transport (STDIO, SSE, or HTTP), merges their tool catalogs into a unified list, and prefixes tool names to avoid collisions. If a server crashes, Relay restarts it automatically. If you edit the config file, Relay picks up the changes without a restart.
No cloud. No accounts. Everything runs on your machine.
```
┌────────────────────────────────────────────────────────┐
│ Endara Relay (single Rust process)                     │
│                                                        │
│  ┌─────────────────────┐  ┌──────────────────────────┐ │
│  │ TCP loopback :9400  │  │ Unix socket / Named pipe │ │
│  │ /mcp  /healthz      │  │ /api/* (per-user, 0600)  │ │
│  │ /oauth/callback     │  │                          │ │
│  └─────────────────────┘  └──────────────────────────┘ │
└────────────────────────────────────────────────────────┘
```
The TCP loopback listener serves MCP traffic, the health probe, and the OAuth callback. The management API (/api/*) is bound exclusively to a per-user OS-local Unix-domain socket (Linux/macOS) or Named Pipe (Windows) with 0600 permissions; it is not reachable over TCP. See Management API for the full endpoint list and the platform-specific socket paths.
Quick Start
1. Install
# Homebrew (recommended, macOS / Linux)
brew install endara-ai/tap/endara-relay
# Or, with cargo:
cargo install endara-relay
# Or download a pre-built binary from GitHub Releases:
# https://github.com/endara-ai/endara-relay/releases
2. Create a config file
mkdir -p ~/.endara
cat > ~/.endara/config.toml << 'EOF'
[relay]
machine_name = "my-laptop"
[[endpoints]]
name = "filesystem"
transport = "stdio"
command = "npx"
args = ["-y", "@modelcontextprotocol/server-filesystem", "/Users/me/projects"]
[[endpoints]]
name = "github"
transport = "stdio"
command = "npx"
args = ["-y", "@modelcontextprotocol/server-github"]
env = { GITHUB_TOKEN = "$GITHUB_TOKEN" }
EOF
3. Run
endara-relay --config ~/.endara/config.toml
4. Connect your MCP client
Point Claude Desktop (or any MCP client) to http://localhost:9400/mcp. You'll see tools from all configured endpoints in a single list, prefixed with the endpoint name:
- `filesystem__read_file`
- `filesystem__write_file`
- `github__list_repos`
- `github__create_issue`
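For a concrete sense of what flows over that endpoint, here is a minimal sketch of the JSON-RPC 2.0 `tools/list` request body an MCP client would POST to `http://localhost:9400/mcp`. This is illustrative only: a real MCP client also performs the `initialize` handshake first, and the exact wire format is defined by the MCP specification, not this README.

```python
import json

def tools_list_request(request_id: int) -> str:
    """Build the JSON-RPC 2.0 body for an MCP `tools/list` call.

    Illustrative sketch only; a real MCP client also performs the
    `initialize` handshake and sets the appropriate HTTP headers.
    """
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/list",
    })

# POSTing a body like this to Relay's /mcp endpoint returns the merged,
# prefixed tool catalog.
print(tools_list_request(1))
```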
Configuration
The config file is TOML. Here's a complete reference:
[relay]
machine_name = "my-laptop"    # Required: identifies this machine
local_js_execution = true     # Optional: enable JS execution mode (default: false)

# STDIO endpoint: spawns a child process
[[endpoints]]
name = "github"               # Required: unique name, used as tool prefix
transport = "stdio"           # Required: "stdio", "sse", or "http"
command = "npx"               # Required for stdio: command to run
args = ["-y", "@modelcontextprotocol/server-github"]   # Optional: command arguments
env = { GITHUB_TOKEN = "$GITHUB_TOKEN" }               # Optional: environment variables

# SSE endpoint: connects to a Server-Sent Events MCP server
[[endpoints]]
name = "remote-server"
transport = "sse"
url = "http://localhost:3001/sse"   # Required for sse/http: server URL

# HTTP endpoint: connects via JSON-RPC over HTTP
[[endpoints]]
name = "http-server"
transport = "http"
url = "http://localhost:4000/mcp"   # Required for sse/http: server URL
Environment variable resolution
Environment variables in env maps are resolved at startup:
| Syntax | Behavior |
|---|---|
| `$VAR` | Replaced with the value of `VAR` from the process environment |
| `$$VAR` | Literal string `$VAR` (escape with a doubled `$`) |
| `plain` | Kept as-is |
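The table above can be modeled in a few lines. This is a simplified Python sketch, not the actual Rust implementation; in particular, falling back to an empty string for an unset variable is an assumption.

```python
import os

def resolve_env_value(value: str, environ=os.environ) -> str:
    """Resolve one `env` map value per the rules above (simplified sketch)."""
    if value.startswith("$$"):
        # `$$VAR` escapes to the literal string `$VAR`
        return value[1:]
    if value.startswith("$"):
        # `$VAR` is replaced from the process environment
        # (empty-string fallback for unset vars is an assumption here)
        return environ.get(value[1:], "")
    # plain strings are kept as-is
    return value
```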
Validation rules
- At least one endpoint must be configured
- Endpoint names must be unique and non-empty
- The `stdio` transport requires a `command` field
- The `sse` and `http` transports require a `url` field
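These rules are straightforward to express over a parsed config. Here is a simplified Python sketch; field names mirror the TOML reference above, and the error messages are illustrative, not Relay's actual output.

```python
def validate_config(config: dict) -> list[str]:
    """Return validation errors for a parsed config (simplified sketch)."""
    errors = []
    endpoints = config.get("endpoints", [])
    if not endpoints:
        errors.append("at least one endpoint must be configured")
    seen = set()
    for ep in endpoints:
        name = ep.get("name", "")
        if not name:
            errors.append("endpoint names must be non-empty")
        elif name in seen:
            errors.append(f"duplicate endpoint name: {name}")
        seen.add(name)
        transport = ep.get("transport")
        if transport == "stdio" and "command" not in ep:
            errors.append(f"{name}: stdio transport requires a command")
        if transport in ("sse", "http") and "url" not in ep:
            errors.append(f"{name}: {transport} transport requires a url")
    return errors
```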
Features
Multi-transport adapters
Connect to any MCP server regardless of how it communicates:
- STDIO: spawns a child process and communicates over stdin/stdout. Ideal for local CLI-based MCP servers like the official `@modelcontextprotocol/server-*` packages.
- SSE: connects to a remote server using HTTP + Server-Sent Events. Good for servers that push updates.
- HTTP: standard JSON-RPC 2.0 over HTTP POST. The simplest remote transport.
Tool prefixing
Every tool is automatically prefixed with its endpoint name to prevent collisions. If endpoint github exposes a tool called list_repos, it becomes github__list_repos in the merged catalog. This means you can connect multiple servers that expose identically-named tools without conflicts.
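The prefixing scheme amounts to a simple merge over per-endpoint catalogs, sketched here in Python (illustrative, not the actual Rust code):

```python
def merge_catalogs(catalogs: dict[str, list[str]]) -> list[str]:
    """Merge per-endpoint tool lists into one catalog, prefixing each
    tool with its endpoint name to avoid collisions (illustrative sketch)."""
    merged = []
    for endpoint, tools in catalogs.items():
        for tool in tools:
            # "github" + "list_repos" -> "github__list_repos"
            merged.append(f"{endpoint}__{tool}")
    return merged
```

Because every name carries its endpoint prefix, two servers that both expose a `search` tool never clash in the merged list.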
Config hot-reload
Relay watches your config file for changes using the notify crate. When you save the file, Relay automatically:
- Starts adapters for newly added endpoints
- Stops adapters for removed endpoints
- Restarts adapters for changed endpoints
- Leaves unchanged endpoints running
No restart required.
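The reload logic above boils down to a diff over the endpoint tables, roughly as follows (a simplified Python sketch; the names and data shapes are illustrative):

```python
def diff_endpoints(old: dict[str, dict], new: dict[str, dict]):
    """Decide which adapters to start, stop, or restart after a config
    reload; unchanged endpoints are left running (illustrative sketch).
    Endpoints are keyed by name; values are their config tables."""
    start = [n for n in new if n not in old]              # newly added
    stop = [n for n in old if n not in new]               # removed
    restart = [n for n in new if n in old and old[n] != new[n]]  # changed
    return start, stop, restart
```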
Crash recovery
If a STDIO server process crashes, Relay automatically restarts it with exponential backoff. After repeated failures, the endpoint is marked unhealthy. This keeps your tool catalog available even when individual servers are flaky.
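Exponential backoff here means doubling the wait between restart attempts up to a cap before giving up and marking the endpoint unhealthy. A sketch of such a schedule (the base delay, cap, and attempt limit below are made up for illustration; Relay's actual values may differ):

```python
def backoff_delays(base: float = 0.5, cap: float = 30.0, max_attempts: int = 6) -> list[float]:
    """Delays (in seconds) between successive restart attempts.

    Hypothetical parameters for illustration; not Relay's real tuning.
    """
    delays = []
    delay = base
    for _ in range(max_attempts):
        delays.append(min(delay, cap))  # never wait longer than the cap
        delay *= 2                      # double the wait each attempt
    return delays
```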
JS execution mode
When local_js_execution = true, Relay replaces the full tool catalog with three meta-tools:
| Meta-tool | Description |
|---|---|
| `list_tools` | List all available tools across all endpoints |
| `search_tools` | Search tools by name or description |
| `execute_tools` | Run a JavaScript script that can call any tool |
This dramatically reduces context window pollution. Instead of exposing hundreds of tools to the AI, it sees only three. The AI writes short JS scripts to discover and call the tools it needs.
Example: the AI calls execute_tools with:
const repos = await call("github__list_repos", { org: "endara-ai" });
const issues = await call("github__list_issues", { repo: repos[0].name });
return { repos: repos.length, firstRepoIssues: issues };
The JS sandbox is powered by boa_engine and runs entirely in-process β no external runtime needed.
Management API
Relay exposes a management REST API for monitoring and control. The API is reachable only through an OS-local Unix-domain socket (Linux/macOS) or Named Pipe (Windows) created per-user with 0600 permissions; it is not bound to TCP.
| Platform | Path |
|---|---|
| Linux | $XDG_RUNTIME_DIR/endara-relay/api.sock (fallback: <data-dir>/api.sock) |
| macOS | $TMPDIR/endara-relay-<uid>/api.sock |
| Windows | \\.\pipe\endara-relay-<session-id> |
| Method | Endpoint | Description |
|---|---|---|
| GET | `/api/status` | Relay status, uptime, endpoint/health counts |
| GET | `/api/endpoints` | List all endpoints with health and transport info |
| GET | `/api/endpoints/:name/tools` | List tools for a specific endpoint |
| GET | `/api/endpoints/:name/logs` | View stderr logs for a STDIO endpoint |
| POST | `/api/endpoints/:name/restart` | Restart a specific endpoint |
| POST | `/api/endpoints/:name/refresh` | Re-fetch the tool catalog for an endpoint |
| GET | `/api/config` | View the current config (env values redacted) |
| POST | `/api/config/reload` | Trigger a config reload |
Example (Linux/macOS):
curl --unix-socket "$XDG_RUNTIME_DIR/endara-relay/api.sock" http://localhost/api/status
Building from Source
Prerequisites
- Rust (stable, 2021 edition)
- macOS, Linux, or Windows
Build
git clone https://github.com/endara-ai/endara-relay.git
cd endara-relay
cargo build --release
The binary will be at target/release/endara-relay.
Run tests
# Unit tests
cargo test
# All tests (including integration tests)
cargo test --all-targets
Releasing
Releases are automated via GitHub Actions. To create a new release:
1. Tag the commit: `git tag v0.1.0 && git push origin v0.1.0`
2. The release workflow automatically:
- Builds release binaries for all platforms (Linux x86_64/aarch64, macOS x86_64/aarch64, Windows x86_64)
- Creates a GitHub Release with the tag
- Uploads platform binaries as release assets
Binary naming convention: `endara-relay-{target_triple}` (e.g. `endara-relay-aarch64-apple-darwin`, `endara-relay-x86_64-pc-windows-msvc.exe`)
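Following that convention, an asset name can be derived from the target triple like so (illustrative sketch; the `.exe` handling matches the Windows example above):

```python
def release_asset_name(target_triple: str) -> str:
    """Release asset name for a given Rust target triple, per the
    naming convention above. Windows targets get an .exe suffix."""
    name = f"endara-relay-{target_triple}"
    if "windows" in target_triple:
        name += ".exe"
    return name
```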
The Endara Desktop release workflow downloads these binaries to bundle as a Tauri sidecar.
CI
On every push and PR, the CI workflow runs:
- `cargo fmt --check` for formatting
- `cargo clippy -- -D warnings` for linting
- `cargo test` for unit tests
- `cargo test --test '*'` for integration tests
- A cross-platform build matrix (Linux, macOS, Windows)
Desktop App
Prefer a UI to running a binary from a terminal? Endara Desktop bundles Relay as a sidecar and adds an endpoint dashboard, log viewer, and one-click OAuth flows. It installs from the same Homebrew tap (brew install --cask endara-ai/tap/endara) and is built on top of this repo. More at endara.ai.
Security
The relay's threat model (including its trust boundaries, the management API's UDS/Named-Pipe isolation, and the OAuth callback's localhost-only CSRF protections) is documented in THREAT_MODEL.md. Report security issues by opening a private security advisory on GitHub.
Contributing
Contributions are welcome! Here's how to get started:
1. Fork the repository
2. Create a branch for your feature or fix (`git checkout -b my-feature`)
3. Make your changes and ensure tests pass (`cargo test`)
4. Run formatting and lints (`cargo fmt && cargo clippy`)
5. Submit a pull request with a clear description of your changes
Please open an issue first for large changes or new features so we can discuss the approach.
Links
- Website: endara.ai
- Desktop app: endara-ai/endara-desktop
- Releases: github.com/endara-ai/endara-relay/releases
License
Licensed under the Apache License, Version 2.0.
Copyright 2025β2026 Endara AI
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
