📦
Ollama Server
by Unclaimed devtools
Extends the Model Context Protocol (MCP) to local LLMs via Ollama, enabling Claude-style tool use (files, web, email, GitHub, AI images) while keeping data private. Modular Python servers for on-prem AI. #LocalAI #MCP #Ollama
Free
View on GitHub
Quick Install
npx mcp-ollama-server
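
After installing, the server must be registered with an MCP-capable client. A minimal sketch of such a registration, assuming a client that reads the common `mcpServers` config format (as used by Claude Desktop's `claude_desktop_config.json`); the entry name `ollama` is illustrative:

```json
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": ["mcp-ollama-server"]
    }
  }
}
```

The client launches the command over stdio and discovers the server's tools via the MCP handshake; consult the server's own documentation for any required environment variables (e.g. the Ollama host address).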

