Valve
MCP client/server framework from scratch
Documentation
valve
Python implementation of the Model Context Protocol (MCP) for building servers that expose tools, resources, and prompts to LLM applications.
Project Purpose
Valve provides a framework for creating MCP-compliant servers and integrating them into LLM applications. It handles protocol negotiation, message routing, transport layer abstractions (stdio, direct, SSE), and provides a host orchestration layer that manages multiple MCP servers simultaneously. The framework enables LLMs to interact with external data sources and tools through a standardized protocol.
Architecture Overview
- MCPLite: Server-side framework for defining MCP servers using decorators. Provides @tool, @resource, and @prompt decorators for registering capabilities.
- Host: Orchestration engine that manages multiple MCP clients, aggregates capabilities, generates system prompts, and implements agent loops for LLM interactions.
- Client: MCP client implementation that connects to servers, performs capability negotiation, and sends requests.
- Server: Message processor that routes incoming MCP requests to registered primitives and returns responses.
- Transport Layer: Abstractions for communication including DirectTransport (in-process), StdioTransport (subprocess via stdio), and SSETransport (HTTP SSE).
- Primitives: Core MCP abstractions including MCPTool, MCPResource, MCPResourceTemplate, and MCPPrompt that wrap Python functions with MCP metadata.
- Messages: Pydantic models for MCP protocol messages including requests, responses, notifications, and errors conforming to JSON-RPC 2.0.
- Registry: ServerRegistry holds server-side primitives with executable code; ClientRegistry holds client-side definitions for capability discovery.
- Inventory: Server discovery and management system that scans directories for available MCP servers and maintains metadata.
- MCPChat: Chat interface built on Chain framework that adds MCP capabilities to conversational LLM applications.
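Since the Messages layer conforms to JSON-RPC 2.0, a tool call on the wire is just a JSON object with `jsonrpc`, `id`, `method`, and `params` fields. A minimal illustration of the shape (method name and field layout follow the MCP specification; Valve's Pydantic models may differ in detail):

```python
import json

# A tool-call request as it might appear on the wire (JSON-RPC 2.0).
# The "tools/call" method and params layout come from the MCP spec,
# not from Valve's actual message models.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "add_numbers", "arguments": {"a": 2, "b": 3}},
}

# The matching response echoes the id and carries either "result" or "error".
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "5"}]},
}

# Round-trip through JSON, as any transport would.
wire = json.dumps(request)
decoded = json.loads(wire)
print(decoded["method"])                          # tools/call
print(response["result"]["content"][0]["text"])   # 5
```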
Dependencies
Major Dependencies:
- pydantic: Data validation and serialization
- requests: HTTP client for fetch server
- beautifulsoup4: HTML parsing
- markdownify: HTML to Markdown conversion
- fastapi: Web framework for SSE transport
- sse-starlette: SSE support for FastAPI
- aiohttp: Async HTTP client
- rich: Terminal formatting and output
Local Dependencies:
Chain: LLM framework providing Model, Message, MessageStore, Prompt, and Chat classes (appears to be an internal dependency)
API Documentation
MCPLite
class MCPLite:
def __init__(self, transport: Optional[Transport | str] = None)
Main class for creating MCP servers.
Key Methods:
def tool(self, func: Callable) -> Callable
Decorator to register a function as an MCP tool. The function must have type annotations and a docstring.
def resource(self, uri: str, mime_type: str = "text/plain", size: int = 1024) -> Callable
Decorator to register a function as an MCP resource or resource template. Use {param} in URI for templates.
def prompt(self, func: Callable) -> Callable
Decorator to register a function as an MCP prompt. Function should return a string or list of PromptMessage objects.
def run(self)
Start the server. Behavior depends on transport type.
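The decorators rely on the decorated function's annotations and docstring to build MCP metadata. A hypothetical sketch of that registration pattern in plain Python (the registry and field names here are illustrative, not Valve's actual MCPTool internals):

```python
import inspect

# Illustrative stand-in for Valve's server-side registry; the real
# ServerRegistry and MCPTool primitive will differ in detail.
TOOL_REGISTRY = {}

def tool(func):
    """Register a function as a tool, deriving metadata by introspection."""
    sig = inspect.signature(func)
    TOOL_REGISTRY[func.__name__] = {
        "description": inspect.getdoc(func),
        "parameters": {
            name: param.annotation.__name__
            for name, param in sig.parameters.items()
        },
    }
    return func

@tool
def add_numbers(a: int, b: int) -> int:
    """Add two numbers together."""
    return a + b

print(TOOL_REGISTRY["add_numbers"])
```

This is why the framework requires type annotations and a docstring: without them there is nothing to introspect into a tool description.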
Host
class Host:
def __init__(
self,
servers: list[str],
model: str = "gpt",
preferred_transport: transport_types = "stdio",
console: Console = Console()
)
Orchestration engine for managing multiple MCP servers.
Parameters:
- servers: List of server names to connect to
- model: Model identifier for LLM
- preferred_transport: Transport type preference ("stdio", "direct", "sse")
- console: Rich console for output
Key Methods:
def agent_query(self, prompt: str, message_store: MessageStore = MessageStore()) -> str | None
Execute an agent loop that handles MCP tool calls. Returns final answer or None.
Client
class Client:
def __init__(
self,
name: str = "Generic Client",
transport: str | Transport | StdioClientTransport = "DirectTransport",
server_function: Optional[Callable] = None
)
MCP client for connecting to servers.
Key Methods:
def initialize(self)
Perform MCP handshake and capability discovery.
def send_request(self, request: MCPRequest) -> MCPResult
Send a request to the server and return the result.
def send_notification(self, notification: MCPNotification)
Send a notification to the server (no response expected).
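With a DirectTransport, a request never leaves the process: the serialized message is handed straight to the server function and the reply is parsed back. A hypothetical sketch of that round trip (function names here are illustrative, not Valve's actual internals):

```python
import json

# Stand-in for the server-side message processor: parses a JSON-RPC
# request, routes on "method", and returns a serialized response.
def server_function(raw: str) -> str:
    msg = json.loads(raw)
    if msg["method"] == "ping":
        return json.dumps({"jsonrpc": "2.0", "id": msg["id"], "result": {}})
    return json.dumps({
        "jsonrpc": "2.0",
        "id": msg["id"],
        "error": {"code": -32601, "message": "Method not found"},
    })

# Stand-in for Client.send_request over a direct, in-process transport.
def send_request(method: str, req_id: int) -> dict:
    raw = json.dumps({"jsonrpc": "2.0", "id": req_id, "method": method})
    return json.loads(server_function(raw))

ok = send_request("ping", 1)
err = send_request("unknown", 2)
print(ok)                      # {'jsonrpc': '2.0', 'id': 1, 'result': {}}
print(err["error"]["code"])    # -32601
```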
MCPChat
class MCPChat(Chat):
def __init__(
self,
servers: list[str],
model: str = "gpt",
preferred_transport: transport_types = "stdio",
**kwargs
)
Chat interface with MCP capabilities. Inherits from Chain's Chat class.
Parameters:
- servers: List of MCP server names to connect to
- model: Model identifier
- preferred_transport: Transport type preference
Additional Commands:
- /status: Show MCP connection status
- /list_tools: List available tools
- /list_resources: List available resources
- /list_prompts: List available prompts
Transport Classes
class StdioClientTransport(Transport):
def __init__(self, server_command: list[str])
Client transport for subprocess communication via stdio.
class StdioServerTransport(Transport):
def __init__(self)
Server transport for stdio communication.
class DirectTransport(Transport):
def __init__(self, server_function: Callable)
In-process transport that directly calls server functions.
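A stdio transport amounts to spawning the server as a subprocess and exchanging newline-delimited JSON over its stdin/stdout. A self-contained sketch of that mechanic (the inline echo server below is a toy, not a Valve server; a real MCP server would route each request through its registry):

```python
import json
import subprocess
import sys

# Toy server: reads one JSON-RPC message per line from stdin and
# echoes the id back with a dummy result.
SERVER_CODE = (
    "import json, sys\n"
    "for line in sys.stdin:\n"
    "    msg = json.loads(line)\n"
    "    reply = {'jsonrpc': '2.0', 'id': msg['id'], 'result': {'ok': True}}\n"
    "    print(json.dumps(reply), flush=True)\n"
)

# Spawn the server subprocess, as StdioClientTransport would with
# its server_command.
proc = subprocess.Popen(
    [sys.executable, "-c", SERVER_CODE],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
)

# One request/response cycle over the pipes.
proc.stdin.write(json.dumps({"jsonrpc": "2.0", "id": 1, "method": "ping"}) + "\n")
proc.stdin.flush()
response = json.loads(proc.stdout.readline())

proc.stdin.close()
proc.wait()
print(response)
```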
Usage Examples
Creating an MCP Server
from valve.mcplite.mcplite import MCPLite
from valve.transport import StdioServerTransport
mcp = MCPLite(transport=StdioServerTransport())
@mcp.tool
def add_numbers(a: int, b: int) -> int:
"""Add two numbers together."""
return a + b
@mcp.resource(uri="myapp://status")
def get_status() -> str:
"""Get application status."""
return "Server is running"
@mcp.prompt
def analysis_prompt(topic: str) -> str:
"""Generate an analysis prompt for a given topic."""
return f"Please analyze the following topic in detail: {topic}"
if __name__ == "__main__":
mcp.run()
Using Host to Orchestrate Multiple Servers
from valve.host.Host import Host
from Chain import MessageStore
# Connect to multiple MCP servers
host = Host(
model="gpt",
servers=["fetch", "obsidian"],
preferred_transport="stdio"
)
# Execute a query that may use multiple tools
message_store = MessageStore()
result = host.agent_query(
"Fetch the content from example.com and save it to my notes",
message_store
)
print(result)
Building a Chat Application with MCP
from valve.mcpchat.mcpchat import MCPChat
# Create chat with MCP capabilities
chat = MCPChat(
model="gpt",
servers=["fetch", "obsidian"],
preferred_transport="direct"
)
# Start interactive chat
chat.chat()
# Or use programmatically
response = chat.query("What's the weather in San Francisco?")
