# MCP Servers

A Rust workspace containing lightweight MCP (Model Context Protocol) servers built with `rmcp`. All servers use STDIO transport and are optimized for low memory usage.

## Servers
| Server | Description |
|---|---|
| mcp-online-search | Web search and research via any OpenAI-compatible LLM (e.g. Perplexity sonar-pro) |
| mcp-sequential-thinking | Chain-of-thought reasoning with branching and revision support |
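
The branching-and-revision model behind `mcp-sequential-thinking` can be sketched roughly as follows. This is an illustrative data structure only; the field names (`number`, `revises`, `branch`) are hypothetical and do not reflect the server's actual schema:

```rust
/// Illustrative model of one step in a chain of thought.
/// Field names are assumptions, not mcp-sequential-thinking's real schema.
#[derive(Debug, Clone)]
struct ThoughtStep {
    /// Position of this step in the chain.
    number: u32,
    /// The reasoning text for this step.
    content: String,
    /// Some(n) if this step revises an earlier step n; None otherwise.
    revises: Option<u32>,
    /// Branch this step belongs to, allowing alternative lines of reasoning.
    branch: String,
}

fn main() {
    // A short chain: two steps on "main", then a revision on a new branch.
    let steps = vec![
        ThoughtStep { number: 1, content: "Outline the problem".into(), revises: None, branch: "main".into() },
        ThoughtStep { number: 2, content: "First attempt".into(), revises: None, branch: "main".into() },
        ThoughtStep { number: 3, content: "Rethink the attempt".into(), revises: Some(2), branch: "alt".into() },
    ];
    for s in &steps {
        match s.revises {
            Some(n) => println!("[{}] step {} (revises {}): {}", s.branch, s.number, n, s.content),
            None => println!("[{}] step {}: {}", s.branch, s.number, s.content),
        }
    }
}
```

A revision points back at the step it replaces rather than overwriting it, so every branch of reasoning stays inspectable after the fact.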
## Build

```sh
# Format all code
cargo fmt --all

# Build all servers
cargo build --release

# Build a specific server
cargo build -p mcp-online-search --release
cargo build -p mcp-sequential-thinking --release

# Install a specific server
cargo install --path mcp-online-search
cargo install --path mcp-sequential-thinking
```

Binaries are output to `target/release/`.
## Quick Start

Each server runs over STDIO; point your MCP client at the binary:
```json
{
  "mcpServers": {
    "online-search": {
      "command": "/path/to/mcp-online-search",
      "env": {
        "LLM_BASE_URL": "https://api.perplexity.ai",
        "LLM_API_KEY": "your-api-key",
        "LLM_MODEL": "sonar-pro"
      }
    },
    "sequential-thinking": {
      "command": "/path/to/mcp-sequential-thinking"
    }
  }
}
```
See each server's README for full configuration details.
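
The variable names `LLM_BASE_URL`, `LLM_API_KEY`, and `LLM_MODEL` come from the config above; how the server resolves them internally can be sketched like this. This is a minimal sketch, not the server's actual code — in particular, treating all three as required (with no defaults) is an assumption:

```rust
use std::env;

/// Hypothetical sketch of how mcp-online-search might resolve its settings.
/// Variable names match the Quick Start config; required-vs-optional
/// behavior and error messages are assumptions.
struct LlmConfig {
    base_url: String,
    api_key: String,
    model: String,
}

/// `get` abstracts over `std::env::var` so the resolution logic is testable.
fn load_config(get: impl Fn(&str) -> Option<String>) -> Result<LlmConfig, String> {
    Ok(LlmConfig {
        base_url: get("LLM_BASE_URL").ok_or("LLM_BASE_URL is required")?,
        api_key: get("LLM_API_KEY").ok_or("LLM_API_KEY is required")?,
        model: get("LLM_MODEL").ok_or("LLM_MODEL is required")?,
    })
}

fn main() {
    // In the real process these come from the "env" block the MCP client sets.
    match load_config(|k| env::var(k).ok()) {
        Ok(cfg) => println!("using model {} at {}", cfg.model, cfg.base_url),
        Err(e) => eprintln!("config error: {e}"),
    }
}
```

Because the MCP client injects the `env` block when it spawns the binary, no shell profile or dotfile setup is needed.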
## License

MIT
