Signallama
Welcome to Signallama, a lightweight, modern chat interface built on SignalR for real-time communication and designed to work seamlessly with your own custom MCP (Model Context Protocol) servers.
🚀 Features
- 💬 Real-time chat powered by SignalR
- 🖥️ Responsive frontend UI
- 🔧 Seamless integration with your own MCP servers (example provided)
- 🏗️ Modular and extensible architecture
- ⚡ Lightweight and easy to deploy
🧰 Tech Stack
📦 Getting Started
Prerequisites
- Ollama
- .NET Core SDK
- An MCP server (optional; you can use your own implementation)
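As a quick sanity check, you can verify the prerequisite tools are on your PATH before going further. The command names below (`dotnet`, `ollama`, `npx`) are the standard CLI names for these tools; `npx` is only needed if you plan to run an npm-based stdio MCP server.

```shell
# Verify the prerequisite tools are installed and on PATH.
# npx is only required for npm-based stdio MCP servers.
for tool in dotnet ollama npx; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
  fi
done
```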
Configure your MCP servers in appsettings.json
Add your SSE server or stdio server entries under the `McpSettings` section:

```json
"McpSettings": {
  "Sse": [
    {
      "Endpoint": "https://localhost:7170",
      "UseStreamableHttp": true,
      "Name": "MyServer",
      "ConnectionTimeout": "00:00:10"
    }
  ]
}
```
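Once your SSE server is running, a quick way to check that the configured endpoint is reachable is a plain HTTP request. The URL below is the example value from the config above; `-k` skips certificate validation, which is convenient for a local dev certificate.

```shell
# Print the HTTP status code for the configured SSE endpoint.
# "000" means the server is not reachable (not started yet, wrong port, etc.).
curl -sk --max-time 5 -o /dev/null -w "%{http_code}\n" https://localhost:7170 || true
```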
Or, for example, running an external project over stdio (note that each `Stdio` entry is an object):

```json
"McpSettings": {
  "Sse": [
    {
      "Endpoint": "https://localhost:7170",
      "UseStreamableHttp": true,
      "Name": "MyServer",
      "ConnectionTimeout": "00:00:10"
    }
  ],
  "Stdio": [
    {
      "Name": "MyStuff",
      "Command": "npx",
      "Arguments": ["-y", "--verbose", "@modelcontextprotocol/server-everything"]
    }
  ]
}
```
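A stray comma or missing brace in this section is easy to introduce, so before restarting the app it can be worth linting the fragment with any JSON parser. The sketch below uses Python's built-in `json.tool` and wraps the section in braces so it parses as a standalone document; the file path is just a scratch location.

```shell
# Validate the McpSettings fragment before pasting it into appsettings.json.
# The fragment is wrapped in { } so it parses as a complete JSON document.
cat > /tmp/mcpsettings.json <<'EOF'
{
  "McpSettings": {
    "Stdio": [
      {
        "Name": "MyStuff",
        "Command": "npx",
        "Arguments": ["-y", "--verbose", "@modelcontextprotocol/server-everything"]
      }
    ]
  }
}
EOF
python3 -m json.tool /tmp/mcpsettings.json >/dev/null && echo "valid JSON"
```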
Start the project
- Start your MCP server (or use the provided example, `signallama.mcp`)
- Start the frontend with `signallama.web`
- Enjoy your lightweight environment!
🤝 Contributing
Contributions are welcome! Please fork the repo and submit a pull request. Feel free to open issues for bugs, suggestions, or features.
📄 License
This project is licensed under the MIT License.
