⚠️ **DEPRECATED**: This project has been superseded by wiseowl-cli (Java 21 + Micronaut). All features have been consolidated into the new project. This repository is archived and will no longer receive updates.
# 🦀 OCLI - Ollama Command Line Interface
A Claude Code-like AI coding assistant with self-improvement capabilities, LCARS Star Trek styling, and full terminal UI control.
## ✨ Features
### 🤖 AI-Powered Development
- **Autonomous tool calling** - AI automatically uses tools to read/write files and execute commands
- **Streaming responses** - Real-time output with progress indicators
- **Context-aware** - Maintains conversation history and file context
- **Self-improving** - Can read and modify its own source code
### 🛠️ Built-in Tools
- `read_file` - Read any file with syntax awareness
- `write_file` - Create/modify files with automatic backups
- `execute_bash` - Run shell commands safely
- `search_files` - Find files and content across your project
- `list_directory` - Explore directory structures
### 📋 Planning Mode
- `/plan` - Create AI-generated step-by-step plans
- `/next` - Execute next step in plan
- `/show-plan` - View plan progress
### 🔧 Project Management (WiseOwl)
- Auto-creates a `wiseowl/` folder for project tracking
- `/todo` - Add tasks to TODO.md
- `/done` - Mark tasks complete
- `/rule` - Add project rules to RULES.md
- `/context` - Add context to CONTEXT.md
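Based on the file names the commands above reference, the generated folder presumably looks like this (layout inferred, not taken from the source tree):

```
wiseowl/
├── TODO.md      # tasks added via /todo, marked complete via /done
├── RULES.md     # project rules added via /rule
└── CONTEXT.md   # context notes added via /context
```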
### 🎨 LCARS Interface
- Authentic Star Trek LCARS styling
- RGB colors in multiples of 51 (hex 33)
- Status indicators: ● (blue=success, red=error, purple=info, yellow=warning)
- Clean vector look with high contrast
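As a quick illustration of the multiples-of-51 rule, each hex channel of a palette color such as #FF9900 decodes to 51 times a small integer. This snippet is just a sanity check of the arithmetic, not part of OCLI (bash syntax):

```shell
# Decode the channels of LCARS orange (#FF9900); each is a multiple of 51 (0x33)
for ch in FF 99 00; do
  printf '%s -> %d (51 x %d)\n' "$ch" "$((16#$ch))" "$((16#$ch / 51))"
done
# FF -> 255 (51 x 5)
# 99 -> 153 (51 x 3)
# 00 -> 0 (51 x 0)
```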
### 🖥️ Terminal UI
- `/monitor` - Full-screen real-time statistics (like top/htop)
- Cursor positioning and color control
- Alternate buffer support
### 🔌 MCP (Model Context Protocol) Support
See MCP Architecture Documentation for detailed deployment options.
- Load external MCP servers for extended functionality
- `/mcp list` - Show available MCP tools
- `/mcp call <tool> [params]` - Invoke MCP tools
- AI automatically knows about available MCP tools
- Configure servers in `.ocli/mcp_servers.json`
### ⚙️ Configuration & Export
- `/config set <key> <value>` - Set configuration
- `/config get <key>` - Get configuration value
- `/config list` - Show all settings
- `/export [filename]` - Export conversation to markdown
### 📊 Statistics & Git
- `/stats` - Show session statistics
- `/git status|diff|log|commit` - Git integration
- `/version` - Show OCLI version
## 📦 Installation
### From Source
```bash
git clone https://github.com/wiseowltechnet/ollama-ocli.git
cd ollama-ocli
cargo build --release
./target/release/ocli
```
### Homebrew (macOS/Linux)
```bash
brew tap wiseowltechnet/ocli
brew install ocli
```
## 🚀 Quick Start
- Start OCLI

```bash
ocli
```

- Ask the AI to help

```
You: create a hello world rust program
```

- Use planning mode for complex tasks

```
You: /plan build a web server with authentication
You: /next
```

- Monitor your session

```
You: /monitor
```

- Configure MCP servers

```bash
mkdir -p .ocli
cat > .ocli/mcp_servers.json << 'JSON'
{
  "servers": [
    {
      "name": "filesystem",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    }
  ]
}
JSON
```
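Before launching OCLI, you can sanity-check that the file is valid JSON with any JSON parser; for example (assuming `python3` is on your PATH):

```shell
# Pretty-prints the config; fails with a parse error and non-zero exit status if malformed
python3 -m json.tool .ocli/mcp_servers.json
```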
## 🎯 Use Cases
### Self-Improvement
OCLI can modify itself:
```
You: add a /version command to show OCLI version
AI: *reads src/main.rs, adds version command, rebuilds*
```
### Project Scaffolding
```
You: /plan create a REST API with user authentication
You: /next
```
### Code Review
```
You: review the code in src/main.rs for potential issues
```
### Debugging
```
You: the tests are failing, can you fix them?
```
## 📋 Slash Commands
| Command | Description |
|---|---|
| `/help` | Show all commands |
| `/plan <task>` | Create step-by-step plan |
| `/next` | Execute next plan step |
| `/show-plan` | View plan progress |
| `/read <file>` | Read file |
| `/write <file>` | Write file |
| `/preview` | Preview pending changes |
| `/apply` | Apply pending changes |
| `/rollback` | Undo last change |
| `/todo <task>` | Add TODO item |
| `/done <id>` | Mark TODO complete |
| `/rule <rule>` | Add project rule |
| `/context <info>` | Add context |
| `/mcp list` | List MCP tools |
| `/mcp call <tool>` | Call MCP tool |
| `/config <cmd>` | Manage settings |
| `/export [file]` | Export conversation |
| `/stats` | Show statistics |
| `/git <cmd>` | Git operations |
| `/monitor` | Real-time monitor |
| `/version` | Show version |
| `/clear` | Clear context |
| `/exit` | Exit OCLI |
## 🎨 LCARS Colors
- **Orange** (#FF9900) - Headers, borders
- **Purple** (#CC99FF) - Session info
- **Blue** (#99CCFF) - Success indicators
- **Yellow** (#FFCC00) - Warnings
- **Red** (#FF6666) - Errors
## 🔧 Configuration
OCLI stores configuration in `.ocli/`:
- `config.json` - User settings
- `mcp_servers.json` - MCP server configuration
- `sessions/` - Conversation history
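The exact keys in `config.json` depend on your OCLI version; a hypothetical example (key names here are illustrative guesses, not documented settings, though 11434 is Ollama's standard default port):

```json
{
  "model": "qwen2.5-coder",
  "ollama_host": "http://localhost:11434",
  "stream": true
}
```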
## 🤝 Contributing
OCLI is designed to be self-improving. You can:
- Ask OCLI to add features to itself
- Submit PRs with new capabilities
- Create MCP servers for extended functionality
## 📄 License
MIT
## 🙏 Acknowledgments
- Built with Ollama
- Inspired by Claude Code
- LCARS design from Star Trek
- MCP protocol support
Made with 🦀 by the OCLI community
## Development & QA
OCLI's Makefile provides a QA pipeline comparable to what Gradle plugins offer for Java projects:
```bash
# Format code (like Spotless)
make fmt

# Lint code (like Checkstyle/SpotBugs)
make lint

# Run tests (like JUnit)
make test

# Full QA pipeline
make qa

# CI pipeline
make ci
```
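A minimal sketch of how these targets might map onto the standard Rust toolchain (the `cargo` commands below are real, but the project's actual Makefile may wire them up differently):

```makefile
# Hypothetical Makefile sketch; target names from the list above
.PHONY: fmt lint test qa ci

fmt:    # rewrite sources in place
	cargo fmt

lint:   # treat clippy warnings as errors
	cargo clippy -- -D warnings

test:   # run unit and integration tests
	cargo test

qa: fmt lint test

ci:     # CI checks formatting instead of rewriting it
	cargo fmt -- --check
	cargo clippy -- -D warnings
	cargo test
```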
See QA_TOOLS.md for the complete guide.
