# Warp ↔️ llama.cpp via FastMCP

This is a FastMCP-based MCP server that bridges Warp Terminal to your local llama.cpp `llama-server` over HTTP. Warp invokes tools via stdio; FastMCP handles all MCP boilerplate.
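At its core, the bridge forwards a tool call to `llama-server`'s OpenAI-compatible HTTP API and returns the model's reply. A minimal stdlib-only sketch of that forwarding step (the server URL, model name, and helper names are illustrative, not taken from the repo; in the actual server this logic would be wrapped as a FastMCP tool):

```python
import json
import urllib.request

# Assumed llama-server address; adjust to match your running instance.
LLAMA_SERVER = "http://127.0.0.1:8080"

def build_chat_request(prompt: str, model: str = "local") -> dict:
    """Build an OpenAI-compatible chat-completion payload for llama-server."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def ask_llama(prompt: str) -> str:
    """POST to llama-server's /v1/chat/completions endpoint and return the reply text."""
    req = urllib.request.Request(
        f"{LLAMA_SERVER}/v1/chat/completions",
        data=json.dumps(build_chat_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

FastMCP would expose `ask_llama` as a tool (e.g. via its tool decorator) and serve it to Warp over stdio, so no HTTP details leak into the MCP side.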
## Prereqs

- Debian 12.12
- Python 3.11 at `/usr/local/bin/python3.11`
- A running `llama-server` (OpenAI-compatible endpoints enabled)
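Recent llama.cpp builds of `llama-server` expose the OpenAI-compatible endpoints (`/v1/chat/completions`, `/v1/models`) by default. A typical launch looks like the following; the model path and port are placeholders for your own setup:

```sh
# Start llama-server locally; swap in your GGUF model path and preferred port.
llama-server -m /path/to/model.gguf --host 127.0.0.1 --port 8080
```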
## Install

```sh
cd /media/waqasm86/External1/Project-Warp/llama-mcp-server-3-fastmcp/
/usr/local/bin/python3.11 -m venv .venv
source .venv/bin/activate
pip install -U pip wheel
pip install -r requirements.txt
```
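After installing, the server still has to be registered in Warp's MCP settings so Warp can launch it over stdio. Warp accepts a command/args/env entry per server; the snippet below is only an illustration of that shape (the entry name and the `server.py` script name are assumptions — check the repo and Warp's MCP settings UI for the exact schema and entry point):

```json
{
  "llama-mcp": {
    "command": "/media/waqasm86/External1/Project-Warp/llama-mcp-server-3-fastmcp/.venv/bin/python",
    "args": ["server.py"],
    "env": {}
  }
}
```

Pointing `command` at the venv's Python ensures the server runs with the dependencies installed above.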
