Anet MCP Server
A high-performance Rust server for the Model Control Protocol (MCP) with JSON-RPC 2.0, NATS messaging, async support, and pluggable AI tools.
A Rust implementation of the Model Control Protocol (MCP) server that enables communication between clients and AI models via a standardized protocol.
This project provides a scalable and asynchronous framework for building AI services using Rust, Tokio, and NATS. It is designed for developers building AI agent systems, LLM-based tools, or custom JSON-RPC 2.0 service layers. The architecture supports real-time message passing, making it ideal for microservices, AI orchestration, and tool-based model interaction.
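To make the JSON-RPC 2.0 framing concrete, here is a minimal, dependency-free sketch of the request envelope a client sends over the transport. It uses plain string formatting for illustration only (the server itself handles payloads with serde_json); the `listTools` method name is taken from the API reference below.

```rust
/// Build a JSON-RPC 2.0 request for `method` with a raw JSON `params` payload.
/// Illustration only: real clients should serialize with serde_json instead
/// of formatting strings by hand.
fn make_request(id: u64, method: &str, params: &str) -> String {
    format!(
        r#"{{"jsonrpc":"2.0","id":{},"method":"{}","params":{}}}"#,
        id, method, params
    )
}

fn main() {
    // A request a client might send to list the server's tools.
    let req = make_request(1, "listTools", "{}");
    println!("{}", req);
}
```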
Features
- JSON-RPC 2.0 compatible API
- NATS transport layer for message passing
- Extensible tool system
- Support for prompts and resources
- Asynchronous request handling with Tokio
Requirements
- Rust 1.70+
- NATS server running locally or accessible via network
Installation
Add the following to your Cargo.toml:
```toml
[dependencies]
anet_mcp_server = "0.1.0"
```
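The usage examples below also pull in a few companion crates. Assuming current releases (adjust versions as needed), a fuller dependency section might look like:

```toml
[dependencies]
anet_mcp_server = "0.1.0"
tokio = { version = "1", features = ["full"] }
async-trait = "0.1"
serde_json = "1"
anyhow = "1"
```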
Getting Started
Running the Example Server
The repository includes a basic example server that demonstrates core functionality:
```sh
# Start a NATS server in another terminal, or ensure one is already running
nats-server

# Run the example server
cargo run --example basic_server
```
Testing the Server
You can test the server using the included test client:
```sh
cargo run --example test_client
```
This will send various requests to the server and print the responses.
Usage
Creating a Server
```rust
use anet_mcp_server::{
    ServerBuilder, ServerCapabilities,
    transport::nats::NatsTransport,
};
use serde_json::json;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Connect to NATS and listen for MCP requests on the given subject.
    let transport = NatsTransport::new("nats://localhost:4222", "mcp.requests").await?;

    let server = ServerBuilder::new()
        .transport(transport)
        .name("my-mcp-server")
        .version("0.1.0")
        .capabilities(ServerCapabilities {
            tools: Some(json!({})),
            prompts: Some(json!({})),
            resources: Some(json!({})),
            notification_options: None,
            experimental_capabilities: None,
        })
        .build()?;

    server.run().await
}
```
Implementing a Custom Tool
```rust
use anet_mcp_server::{Content, Tool};
use async_trait::async_trait;
use serde_json::{json, Value};

struct MyTool;

#[async_trait]
impl Tool for MyTool {
    fn name(&self) -> String {
        "my_tool".to_string()
    }

    fn description(&self) -> String {
        "A custom tool".to_string()
    }

    fn input_schema(&self) -> Value {
        json!({
            "type": "object",
            "properties": {
                "input": { "type": "string" }
            }
        })
    }

    async fn call(&self, _input: Option<Value>) -> anyhow::Result<Vec<Content>> {
        // Inspect `_input` here to read arguments matching `input_schema`.
        Ok(vec![Content::Text {
            text: "Tool response".to_string(),
        }])
    }
}
```
API Reference
The server implements the following JSON-RPC methods:
- initialize – Initialize the connection and get server information
- listTools – Get a list of available tools
- callTool – Call a specific tool with arguments
- listResources – Get a list of available resources
- readResource – Read a specific resource
- listPrompts – Get a list of available prompts
- getPrompt – Get a specific prompt with arguments
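As an illustration, a callTool exchange might be framed as below. The `jsonrpc`, `id`, `method`, and `params` fields follow the JSON-RPC 2.0 specification; the `name`, `arguments`, and `content` field names are assumptions sketched from common MCP conventions, so check the crate's types for the exact shape.

```json
{"jsonrpc": "2.0", "id": 1, "method": "callTool",
 "params": {"name": "my_tool", "arguments": {"input": "hello"}}}

{"jsonrpc": "2.0", "id": 1,
 "result": {"content": [{"type": "text", "text": "Tool response"}]}}
```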
Architecture
The server follows a modular design:
- server – Core server logic and request handling
- transport – Message transport layer (currently NATS)
- tools – Tool interfaces and implementations
- types – Common data structures
License
MIT License
