Ollama Almasrv MCP
MCP server for accessing Ollama models on a remote GPU server (RTX PRO 6000, 96 GB) via the Model Context Protocol. Ten models are available: four local and six cloud. Tools: chat, think, embed, similarity, health, models.
Installation
npx ollama-almasrv-mcp
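To run the server from an MCP client, it would typically be registered in the client's configuration file. The sketch below shows a plausible entry using the standard `mcpServers` format (as used by Claude Desktop and similar clients); the server name key `ollama-almasrv` is an arbitrary label, and any server-specific environment variables (such as the remote Ollama host) are not documented here and are omitted.

```json
{
  "mcpServers": {
    "ollama-almasrv": {
      "command": "npx",
      "args": ["ollama-almasrv-mcp"]
    }
  }
}
```

After restarting the client, the server's tools (chat, think, embed, similarity, health, models) should appear in the client's tool list.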
