Multi-CLI MCP
An MCP server that lets Claude, Gemini, Codex, and OpenCode call each other as tools.
Claude: "Hey Gemini, what do you think about this code?"
Gemini: "It's a mess. Let me ask Codex for a second opinion."
Codex: "You're both wrong. Here's the fix."
OpenCode: "I checked with three providers. They all agree with Codex."
One-Line Install (macOS / Linux)
curl -fsSL https://raw.githubusercontent.com/osanoai/multicli/main/install.sh | bash
Detects which AI CLIs you have installed and configures Multi-CLI for them automatically.
- Claude Code is configured to use a per-user local HTTP service on `127.0.0.1`
- Gemini CLI, Codex CLI, and OpenCode keep using stdio/local config by default
- The installer may update client config files on your behalf
What It Does
Multi-CLI sits between your AI clients and bridges them via the Model Context Protocol. Install it once, and whichever AI you're talking to gains the ability to call the others.
- Claude can ask Gemini, Codex, or OpenCode for help
- Gemini can delegate to Claude, Codex, or OpenCode
- Codex can consult Claude, Gemini, or OpenCode
- OpenCode can call Claude, Gemini, or Codex (across 75+ providers)
- Each client's own tools are hidden (no talking to yourself, that's weird)
- Auto-detects which CLIs you have installed and only shows what's available
The Meta Part
This tool was built by the very AIs it connects.
Claude, Gemini, Codex, and OpenCode wrote the code. Claude, Gemini, Codex, and OpenCode maintain it. Every night, a CI job queries the latest stable release of each CLI for its current model list, diffs the results against what's in the repo, and automatically publishes a new version if anything changed. New model releases get picked up within 24 hours. Deprecated models get cleaned out. The repo stays current without anyone touching it.
The stdio install paths use @latest, so those clients pick up new releases automatically. Claude Code's managed HTTP service intentionally runs from a stable installed runtime instead of a transient npx cache; rerun the installer or multicli service refresh --configure-claude after upgrading.
Most MCP tools go stale within weeks. This one is self-maintaining by design.
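The nightly refresh can be pictured as a simple fetch-diff-publish loop. This is only an illustrative sketch of the job described above, not the actual CI workflow; `fetch_current_models` is a placeholder for the per-CLI model queries.

```shell
#!/bin/sh
# Sketch of the nightly model-sync job (illustrative; names are placeholders).
cd "$(mktemp -d)"

fetch_current_models() {
  # Stands in for querying each CLI's latest stable release for its model list.
  echo '{"gemini": ["..."], "codex": ["..."]}'
}

fetch_current_models > models.new.json
if ! diff -q models.json models.new.json >/dev/null 2>&1; then
  # Model list changed (or no previous snapshot): commit and publish.
  mv models.new.json models.json
  echo "model list changed: publish a new version"
else
  rm models.new.json
  echo "model list unchanged: nothing to do"
fi
```

The real job additionally prunes deprecated models and bumps the package version before publishing.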
Prerequisites
You need Node.js >= 20 and at least one of these CLIs installed:
| CLI | Install |
|---|---|
| Claude Code | `npm install -g @anthropic-ai/claude-code` |
| Gemini CLI | `npm install -g @google/gemini-cli` |
| Codex CLI | `npm install -g @openai/codex` |
| OpenCode | `curl -fsSL https://opencode.ai/install \| bash` |
Multi-CLI is most useful with two or more CLIs installed. With only one, the install still works, but there may be nothing to bridge yet.
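To see at a glance which of the supported CLIs are already on your PATH, a plain POSIX-shell check (nothing Multi-CLI-specific) looks like this:

```shell
# Report install status for each of the four supported CLIs.
for cli in claude gemini codex opencode; do
  if command -v "$cli" >/dev/null 2>&1; then
    echo "$cli: installed ($(command -v "$cli"))"
  else
    echo "$cli: not found"
  fi
done
```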
Manual Installation
Prefer to install per-client yourself? Each command is one line.
Claude Code
Recommended for Claude Code stability:
npm install -g @osanoai/multicli
multicli service install --configure-claude
This installs a per-user background service that listens only on 127.0.0.1, configures Claude Code to use Multi-CLI over HTTP, and keeps the underlying server alive even if Claude drops a stdio subprocess.
The managed service installs as a per-user login/background service:
- macOS: launchd LaunchAgent
- Ubuntu/Debian: `systemd --user`
- Windows: Scheduled Task at user logon

Security model:
- Listens only on `127.0.0.1`
- Uses `Authorization: Bearer <token>` for every HTTP MCP request
- Stores generated service artifacts in a user-owned service directory

Operational notes:
- `multicli service install` refuses transient `npx` runtimes; use a global install or a stable local checkout
- Generated service state includes an env file you can edit if your local AI CLIs require additional credentials or PATH entries
- On Linux, the default `systemd --user` service follows your login session; enable `linger` yourself only if you intentionally want it to stay up after logout
Legacy stdio installation is still available:
claude mcp add --scope user Multi-CLI -- npx -y @osanoai/multicli@latest
Claude Desktop (JSON config)
Add to ~/Library/Application Support/Claude/claude_desktop_config.json on macOS:
{
"mcpServers": {
"Multi-CLI": {
"command": "npx",
"args": ["-y", "@osanoai/multicli@latest"]
}
}
}
Restart Claude Desktop completely after saving.
Gemini CLI
gemini mcp add --scope user Multi-CLI npx -y @osanoai/multicli@latest
Manual config (~/.gemini/settings.json)
{
"mcpServers": {
"Multi-CLI": {
"command": "npx",
"args": ["-y", "@osanoai/multicli@latest"]
}
}
}
Codex CLI
codex mcp add Multi-CLI -- npx -y @osanoai/multicli@latest
Manual config lives in ~/.codex/config.toml, or pass a JSON config at launch with --mcp-config:
codex --mcp-config mcp.json
Where mcp.json contains:
{
"mcpServers": {
"Multi-CLI": {
"command": "npx",
"args": ["-y", "@osanoai/multicli@latest"]
}
}
}
OpenCode
OpenCode's mcp add command is interactive, so add Multi-CLI to ~/.config/opencode/opencode.json directly:
{
"mcp": {
"Multi-CLI": {
"type": "local",
"command": ["npx", "-y", "@osanoai/multicli@latest"]
}
}
}
If the file already exists, merge the "Multi-CLI" entry into the existing "mcp" object.
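If you'd rather script the merge than edit JSON by hand, something like the following works, assuming `jq` is installed (the config path is the one given above; a backup is kept in case the merge goes wrong):

```shell
cfg="$HOME/.config/opencode/opencode.json"
if [ -f "$cfg" ]; then
  cp "$cfg" "$cfg.bak"   # keep a backup before rewriting the file
  # jq creates the "mcp" object if it is missing and leaves existing entries alone.
  jq '.mcp["Multi-CLI"] = {"type": "local", "command": ["npx", "-y", "@osanoai/multicli@latest"]}' \
    "$cfg.bak" > "$cfg"
else
  echo "no existing config; create $cfg with the JSON shown above"
fi
```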
Any Other MCP Client
Multi-CLI supports both stdio and Streamable HTTP.
For stdio-capable clients, point them at:
npx -y @osanoai/multicli@latest
For a managed local HTTP service, install Multi-CLI globally and run:
multicli service install
The service listens on 127.0.0.1 only and exposes /mcp plus /health.
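As a quick connectivity check, you can hit the health endpoint directly. This assumes the service is running with the default host and port; adjust if you changed MULTICLI_HTTP_HOST or MULTICLI_HTTP_PORT.

```shell
# Probe the managed service's health endpoint (defaults: 127.0.0.1:37420).
curl -fsS http://127.0.0.1:37420/health \
  || echo "service not reachable: check 'multicli service status'"
```

Note that /health is unauthenticated, but MCP requests to /mcp require the bearer token the installer generates.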
Available Tools
Once connected, your AI client gains access to tools for the other CLIs (never its own):
| Tool | Description |
|---|---|
| List-Gemini-Models | List available Gemini models and their strengths |
| Ask-Gemini | Ask Gemini a question or give it a task |
| Fetch-Chunk | Retrieve chunked responses from Gemini |
| Gemini-Help | Get Gemini CLI help info |
| List-Codex-Models | List available Codex models |
| Ask-Codex | Ask Codex a question or give it a task |
| Codex-Help | Get Codex CLI help info |
| List-Claude-Models | List available Claude models |
| Ask-Claude | Ask Claude a question or give it a task |
| Claude-Help | Get Claude Code CLI help info |
| List-OpenCode-Models | List available OpenCode models from all configured providers |
| Ask-OpenCode | Ask OpenCode a question or give it a task |
| OpenCode-Help | Get OpenCode CLI help info |
Task-Capable Ask Tools
The Ask-* tools still work as normal synchronous MCP tools, but they now also advertise optional task-based execution for MCP clients that support tasks.
- Task-capable clients can run long `Ask-*` calls using MCP tasks to avoid long blocking tool requests
- Older clients keep using the same `Ask-*` tools synchronously with no config changes
- `List-*`, `*-Help`, and `Fetch-Chunk` remain normal synchronous tools
Usage Examples
Once installed, just talk naturally to your AI:
"Ask-Gemini what it thinks about this architecture"
"Have Codex review this function for performance issues"
"Get Claude's opinion on this error message"
"Use OpenCode to get a second opinion from Llama"
Or get a second opinion on anything:
"I want three perspectives on how to refactor this module:
ask Gemini and Codex what they'd do differently"
How It Works
┌──────────────┐    MCP (stdio)    ┌──────────────┐    CLI calls    ┌──────────────┐
│   Your AI    │ ────────────────► │   Multi-CLI  │ ──────────────► │   Other AIs  │
│   Client     │                   │   server     │                 │  (CLI tools) │
└──────────────┘                   └──────────────┘                 └──────────────┘
1. Your AI client connects to Multi-CLI via MCP
2. Multi-CLI detects which CLIs are installed on your system
3. It registers tools for the OTHER clients (hides tools for the calling client)
4. When a tool is called, Multi-CLI executes the corresponding CLI command
5. Results flow back through MCP to your AI client
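Steps 2 and 3 can be sketched in a few lines of shell. The real server does this in Node, so this is only an illustration of the filtering rule, with `caller` standing in for whichever client opened the MCP session:

```shell
caller="claude"   # the client this MCP session belongs to (assumption for the demo)

for cli in claude gemini codex opencode; do
  [ "$cli" = "$caller" ] && continue             # step 3: hide the caller's own tools
  if command -v "$cli" >/dev/null 2>&1; then     # step 2: only CLIs actually installed
    echo "registering Ask/List/Help tools for: $cli"
  fi
done
```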
For Claude Code, Multi-CLI can also run as a local background HTTP service:
┌──────────────┐    MCP (HTTP)     ┌──────────────┐    CLI calls    ┌──────────────┐
│ Claude Code  │ ────────────────► │   Multi-CLI  │ ──────────────► │   Other AIs  │
│   Client     │  127.0.0.1 only   │   service    │                 │  (CLI tools) │
└──────────────┘                   └──────────────┘                 └──────────────┘
Troubleshooting
"No usable AI CLIs detected" Make sure at least one other CLI is installed and on your PATH:
which gemini; which codex; which claude; which opencode
No tools showing up? If only your own CLI is installed, Multi-CLI hides it (no self-calls). Install a different CLI to enable cross-model collaboration.
MCP server not responding?
- Check that Node.js >= 20 is installed
- Run `npx -y @osanoai/multicli@latest` directly to see if the stdio server starts
- Restart your AI client completely
Claude Code disconnecting after successful calls? Use the managed background service instead of the legacy stdio integration:
npm install -g @osanoai/multicli
multicli service install --configure-claude
multicli service doctor
Helpful service commands:
- `multicli service status`
- `multicli service doctor`
- `multicli service logs`
- `multicli service refresh`
- `multicli service uninstall`
Need to tune timeouts or cleanup behavior? Multi-CLI supports these optional environment variables:
- `MULTICLI_TRANSPORT` (`stdio` or `http`)
- `MULTICLI_ASK_TIMEOUT_MS`
- `MULTICLI_HELP_TIMEOUT_MS`
- `MULTICLI_CLI_DETECT_TIMEOUT_MS`
- `MULTICLI_KILL_GRACE_MS`
- `MULTICLI_HTTP_HOST` (defaults to `127.0.0.1`)
- `MULTICLI_HTTP_PORT` (defaults to `37420`)
- `MULTICLI_HTTP_PATH` (defaults to `/mcp`)
- `MULTICLI_HTTP_AUTH_TOKEN` (required for direct HTTP mode)
- `MULTICLI_HTTP_SESSION_IDLE_MS`
- `MULTICLI_LOG_PATH` (defaults to `~/.multicli/logs/multicli.log`)
- `MULTICLI_LOG_LEVEL` (`error`, `info`, or `debug`; defaults to `debug` for the file log)
- `MULTICLI_STDERR_LOG_LEVEL` (`silent`, `error`, `info`, or `debug`; defaults to `error`)
- `MULTICLI_SERVICE_ROOT_DIR`
- `MULTICLI_SERVICE_LOG_PATH`
- `MULTICLI_SERVICE_ENV_PATH`
- `MULTICLI_SERVICE_MANIFEST_PATH`
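For example, a stdio run with a longer ask timeout and a quieter file log could be set up like this (the values are illustrative, not recommendations):

```shell
# Illustrative values; see the variable list above for what each one controls.
export MULTICLI_ASK_TIMEOUT_MS=300000
export MULTICLI_LOG_LEVEL=info

# Then start the server as usual, e.g.:
#   npx -y @osanoai/multicli@latest
env | grep '^MULTICLI_'
```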
The server writes structured JSON-line logs to a single file destination, rotates them automatically for long-running service mode, and includes full prompt bodies for Ask-* requests so disconnects and crashes can be reconstructed after the fact.
Development
git clone https://github.com/osanoai/multicli.git
cd multicli
npm install
npm run build
npm run dev
