Tinvo
An LLM AI client based on Blazor (OpenAI, ChatGPT, Llama, Ollama, ONNX, DeepSeek-R1, ...).
Tinvo
A powerful, cross-platform, and open-source LLM AI client.
Overview
Tinvo is designed to be a comprehensive and extensible client for various AI models, supporting both cloud-based APIs and local inference engines. Its cross-platform nature allows you to run it seamlessly on desktops, mobile devices, and even directly in your web browser.
Features
- Multi-Provider Support:
  - OpenAI
  - iFlytek (Xunfei)
  - ONNX
  - Ollama
  - Llama Models
  - Model Context Protocol (MCP) Tool Calls
- Cross-Platform Compatibility:
  - Android
  - iOS
  - Windows
  - macOS
  - Linux
  - Web Server
  - WebAssembly (WASM)
- Storage Support:
  - Local
  - WebDAV
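Several of the providers listed above (OpenAI, Ollama, and many Llama-family servers) expose an OpenAI-compatible chat endpoint, which is one common way a multi-provider client can keep a single request shape across backends. The sketch below is illustrative only and assumes nothing about Tinvo's internal API; the function name is hypothetical, though the endpoint paths shown (OpenAI's `/v1/chat/completions` and Ollama's OpenAI-compatible `/v1` prefix on port 11434) are the documented defaults for those services.

```python
import json

def build_chat_request(base_url: str, model: str, prompt: str) -> tuple[str, str]:
    """Return (endpoint URL, JSON body) for an OpenAI-compatible chat call.

    Hypothetical helper for illustration -- not part of Tinvo.
    """
    url = f"{base_url.rstrip('/')}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, body

# A cloud provider and a local Ollama server accept the same request shape;
# only the base URL and model name differ.
openai_url, _ = build_chat_request("https://api.openai.com/v1", "gpt-4o-mini", "Hi")
ollama_url, _ = build_chat_request("http://localhost:11434/v1", "llama3", "Hi")
```

Because the payload is identical across such backends, switching providers reduces to changing configuration rather than code, which is the design property a multi-provider client like this depends on.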
Screenshots

License
This project is licensed under the MIT License.
