MCP-Spark
A Python framework for building Model Context Protocol (MCP) servers.
Overview
MCP-Spark is a modular, extensible, and Docker-ready framework for creating MCP servers that can dynamically load tools, handle client requests, and be configured via YAML files. It enables users to define custom MCP tools, prompts, and resources through plugins or direct implementation.
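Dynamic tool loading typically means importing plugin modules from disk at runtime. The framework's actual loader is not shown here, but the core mechanism can be sketched with the standard library alone (the `greet` function and `greeter.py` module below are made up for illustration):

```python
import importlib.util
import os
import tempfile

# Write a throwaway plugin module to disk, standing in for a file
# that would normally live in the plugins/ directory.
plugin_source = '''
def greet(name):
    """A trivial stand-in for a dynamically loaded MCP tool."""
    return f"Hello, {name}!"
'''

plugin_dir = tempfile.mkdtemp()
plugin_path = os.path.join(plugin_dir, "greeter.py")
with open(plugin_path, "w") as f:
    f.write(plugin_source)

# Load the module from its file path, the way a plugin loader might.
spec = importlib.util.spec_from_file_location("greeter", plugin_path)
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)

result = module.greet("MCP")
print(result)  # Hello, MCP!
```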
Features
- Modular and extensible architecture
- Docker containerization support
- Dynamic tool loading
- Client request handling via the MCP protocol
- Configuration via YAML
- Plugin system for custom tools, prompts, and resources
- Support for multiple transport options (stdio, http, sse)
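A server configured with an unknown transport should fail early rather than at connection time. A minimal validation sketch for the three supported transport names (illustrative only, not the framework's actual API):

```python
VALID_TRANSPORTS = {"stdio", "http", "sse"}

def choose_transport(name: str) -> str:
    """Normalize and validate a transport name before starting the server."""
    name = name.strip().lower()
    if name not in VALID_TRANSPORTS:
        raise ValueError(f"unsupported transport: {name!r}")
    return name

print(choose_transport("STDIO"))  # stdio
```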
Installation
Using pip
pip install mcp-spark
Using uv
uv venv
uv sync
From source
git clone https://github.com/username/mcp-spark.git
cd mcp-spark
uv venv
uv sync
python -m build
pip install dist/*.whl
Usage
MCP-Spark supports two usage methods:
Direct Source Code Usage
- Download the framework's source code
- Write your own tools in the plugins directory
- Configure the framework using the provided configuration file
- Build a Docker image to deploy the MCP server
Example:
# Clone the repository
git clone https://github.com/username/mcp-spark.git
cd mcp-spark
# Create your custom tools
mkdir -p plugins
# Create your tools in plugins/
# Configure the server
# Edit config/config.yaml
# Build and run with Docker
docker build -t my-mcp-server .
docker run -it my-mcp-server
Framework Package Integration
- Install the mcp-spark package
- Register your custom tools by calling the framework's Python interfaces
- Start the MCP server by invoking framework.start()
Example:
from mcp_spark.server import MCPServer
from fastmcp import Tool

# Define your custom tool
class MyCustomTool(Tool):
    def __init__(self):
        super().__init__(
            name="my_tool",
            description="My custom tool",
            # Define schemas...
        )

    def execute(self, params):
        # Implement your tool logic
        return {"result": "success"}

# Create server configuration
config = {
    "server": {
        "name": "MyMCPServer",
        "transport": "stdio"
    }
}

# Initialize and start the server
server = MCPServer(config_dict=config)

# Register your custom tool
server.tool_manager.register_tool(MyCustomTool())

# Start the server
server.start()
Configuration
MCP-Spark is configured using a YAML file. Here's an example configuration:
server:
  name: "MCPSparkServer"
  transport: "stdio"  # Options: stdio, http, sse
  host: "localhost"
  port: 8000
  log_level: "INFO"

plugins:
  directories:
    - "plugins"
  modules:
    - "custom_tools.weather"

tools:
  weather:
    module: "custom_tools.weather"
    class: "WeatherTool"
    description: "Get weather information for a location"

prompts:
  weather_analysis:
    content: "Analyze the following weather data: {{data}}"

resources:
  weather_data:
    module: "custom_tools.weather"
    function: "get_weather_data"

docker:
  base_image: "python:3.9-slim"
  requirements:
    - "fastmcp>=0.1.0"
    - "pyyaml>=6.0"
  entrypoint: "python -m mcp_spark.main"
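When loaded (e.g. with pyyaml's yaml.safe_load), the configuration above deserializes to a nested dict. A minimal sketch of validating the keys a server would plausibly need before starting; the validation logic is illustrative, not the framework's actual behavior:

```python
# Equivalent of the YAML config after yaml.safe_load (abbreviated).
config = {
    "server": {
        "name": "MCPSparkServer",
        "transport": "stdio",
        "host": "localhost",
        "port": 8000,
        "log_level": "INFO",
    },
    "plugins": {
        "directories": ["plugins"],
        "modules": ["custom_tools.weather"],
    },
}

def validate_config(cfg):
    """Check the minimum keys a server would need (illustrative)."""
    server = cfg.get("server", {})
    missing = [k for k in ("name", "transport") if k not in server]
    if missing:
        raise ValueError(f"missing server keys: {missing}")
    if server["transport"] not in {"stdio", "http", "sse"}:
        raise ValueError(f"unknown transport: {server['transport']}")
    return True

print(validate_config(config))  # True
```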
Creating Plugins
Plugins are Python modules that provide tools, prompts, or resources for the MCP server. Here's an example of a simple tool plugin:
from typing import Dict, Any
from fastmcp import Tool

class WeatherTool(Tool):
    def __init__(self):
        super().__init__(
            name="weather",
            description="Get weather information for a location",
            input_schema={
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The location to get weather information for"
                    }
                },
                "required": ["location"]
            },
            output_schema={
                "type": "object",
                "properties": {
                    "temperature": {"type": "number"},
                    "humidity": {"type": "number"},
                    "conditions": {"type": "string"}
                }
            }
        )

    def execute(self, params: Dict[str, Any]) -> Dict[str, Any]:
        location = params["location"]
        # Implement your tool logic here
        return {
            "temperature": 22.5,
            "humidity": 65,
            "conditions": "Partly cloudy"
        }
Docker Integration
MCP-Spark provides a utility to generate a Dockerfile for your MCP server project:
python -m mcp_spark.util.dockerfile_gen -c config/config.yaml -o Dockerfile
You can then build and run your Docker image:
docker build -t my-mcp-server .
docker run -it my-mcp-server
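Based on the docker: section of the example configuration, the generated Dockerfile would plausibly look something like the following (an illustrative sketch, not the generator's verbatim output):

```dockerfile
FROM python:3.9-slim
WORKDIR /app
COPY . .
RUN pip install "fastmcp>=0.1.0" "pyyaml>=6.0"
ENTRYPOINT ["python", "-m", "mcp_spark.main"]
```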
License
MIT
