Vibe Local
Free AI coding environment: Ollama + Python
vibe-local
Free AI Coding Agent: Offline, Local, Open Source
Single-file Python agent, stdlib only, zero dependencies. No API keys. No cloud. No cost.
Built for offline workshops where instructors support learners with AI agents, for students without paid plans who want to practice agent coding, and for beginners learning terminal operations through natural language â a non-profit research and education utility.
What is this?
A free AI coding environment you can set up with a single command on your Mac, Windows, or Linux. No network required. Completely free. Python + Ollama only â a fully open-source coding agent.
The core agent vibe-coder.py is a single file written entirely with the Python standard library. No pip install needed. Zero external dependencies. The source code is human-readable as-is, making it ideal as teaching material for understanding how AI coding agents work, or as a research baseline. Everything is open source (MIT).
vibe-local → vibe-coder.py (OSS, Python stdlib only, ~7400 lines) → Ollama (direct)
No login. No Node.js. No proxy process. 16 built-in tools, sub-agents, parallel agents, file watcher, image/PDF reading. MCP integration, Skills system, Plan/Act mode, Git checkpoints, auto-test loop, fixed footer (DECSTBM). 787 tests.
Install (3 steps)
1. Open Terminal (Mac: Spotlight Cmd+Space → search "Terminal" / Windows: Open PowerShell)
2. Paste and hit Enter:
For Mac / Linux / Windows(WSL):
curl -fsSL https://raw.githubusercontent.com/ochyai/vibe-local/main/install.sh | bash
For Windows (PowerShell natively):
Invoke-Expression (Invoke-RestMethod -Uri https://raw.githubusercontent.com/ochyai/vibe-local/main/install.ps1)
3. Open a new terminal and run:
vibe-local
Usage
# Interactive mode (chat with AI while coding)
vibe-local
# One-shot (ask once)
vibe-local -p "Create a snake game in Python"
# Specify model manually
vibe-local --model qwen3:8b
Supported Environments
| Environment | RAM | Main Model | Sidecar | Notes |
|---|---|---|---|---|
| Apple Silicon Mac (M1+) | 96GB+ | gpt-oss:120b | qwen3-coder:30b | Fastest ~70tok/s |
| Apple Silicon Mac (M1+) | 32GB+ | qwen3-coder:30b | qwen3:8b | Recommended |
| Apple Silicon Mac (M1+) | 16GB | qwen3:8b | qwen3:1.7b | Very capable |
| Apple Silicon Mac (M1+) | 8GB | qwen3:1.7b | none | Minimum viable |
| Intel Mac | 16GB+ | qwen3:8b | qwen3:1.7b | Works but slower |
| Windows (Native) | 16GB+ | qwen3:8b | qwen3:1.7b | NVIDIA GPU recommended |
| Windows (WSL2) | 16GB+ | qwen3:8b | qwen3:1.7b | NVIDIA GPU recommended |
| Linux (x86_64/arm64) | 16GB+ | qwen3:8b | qwen3:1.7b | NVIDIA GPU recommended |
Sidecar model = auto-selected lighter model for permission checks, init probes, and short summaries.
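The tier table above boils down to a simple RAM threshold lookup. Here is a minimal sketch of that auto-selection logic; the names and structure are illustrative, not the actual vibe-coder.py internals, and the table data comes from this README:

```python
# Hypothetical sketch of RAM-based model auto-selection
# (illustrative names; see vibe-coder.py for the real logic).
TIERS = [
    (96, "gpt-oss:120b", "qwen3-coder:30b"),
    (32, "qwen3-coder:30b", "qwen3:8b"),
    (16, "qwen3:8b", "qwen3:1.7b"),
    (8,  "qwen3:1.7b", None),
]

def pick_models(ram_gb: int):
    """Return (main_model, sidecar_model) for the given RAM size."""
    for min_ram, main, sidecar in TIERS:
        if ram_gb >= min_ram:
            return main, sidecar
    return "qwen3:1.7b", None  # below 8GB: minimum viable
```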
Troubleshooting
Common issues and solutions
"ollama failed to start"
open -a Ollama # macOS
ollama serve # Linux / Windows
"model not found"
ollama pull qwen3:8b
"vibe-coder.py not found"
# Reinstall
curl -fsSL https://raw.githubusercontent.com/ochyai/vibe-local/main/install.sh | bash
Change model
nano ~/.config/vibe-local/config
# Change MODEL="qwen3:8b"
# SIDECAR_MODEL="qwen3:1.7b" # For lightweight tasks (optional, auto-selected)
Enable debug logging
VIBE_LOCAL_DEBUG=1 vibe-local
Footer (status bar) is garbled
# Disable scroll region
VIBE_NO_SCROLL=1 vibe-local
Debug terminal rendering
# Log escape sequences to file
VIBE_DEBUG_TUI=1 vibe-local
# Log: ~/.vibe-tui-debug.log
Diagnose scroll region interactively
> /debug-scroll
Architecture
User
 └─ vibe-local.sh / vibe-local.ps1 (launch script)
     ├─ Ensure Ollama is running
     └─ Launch vibe-coder.py (direct, no proxy)
          │
          ▼
vibe-coder.py (single file, Python stdlib only, ~7400 lines)
 ├─ Agent loop (parallel tool execution):
 │    user input → LLM → tool calls → execute → add results → loop until done
 ├─ 16 built-in tools + MCP tools:
 │    Bash (+ background), Read (+ images/PDF/ipynb), Write, Edit (+ rich diff),
 │    Glob, Grep, WebFetch, WebSearch, NotebookEdit, SubAgent, ParallelAgents,
 │    TaskCreate/List/Get/Update, AskUserQuestion
 ├─ v1.0 extensions:
 │    MCP client (JSON-RPC 2.0, stdio, tool discovery)
 │    Skills system (.md files → system prompt injection)
 │    Plan/Act mode (read-only → execution transition)
 │    Git checkpoint (stash-based rollback)
 │    Auto test loop (lint + test after edits)
 ├─ v1.1 extensions:
 │    File watcher (poll-based change detection)
 │    Parallel agents (multi-task concurrent execution)
 │    Streaming enhancement (tool call delta accumulation)
 ├─ v1.3 extensions:
 │    ScrollRegion (DECSTBM fixed footer, store-only)
 │    ESC interrupt (immediate generation stop)
 │    Type-ahead input (buffered during response)
 │    Debug logging (VIBE_DEBUG_TUI=1 → log file)
 ├─ System prompt + OS-specific hints:
 │    macOS: brew, /Users/, system_profiler
 │    Linux: apt, /home/
 │    Windows: winget, %USERPROFILE%
 ├─ XML tool call fallback (Qwen model compatibility)
 ├─ Permission manager (safe / ask / deny tiers)
 ├─ Session persistence (JSONL) + context compaction
 ├─ TUI (readline, ANSI colors, markdown rendering)
 └─ Multimodal (image base64 → Ollama vision models)
          │
          │ OpenAI Chat API (/v1/chat/completions)
          ▼
Ollama (localhost:11434)
 └─ Local LLM inference runtime: qwen3-coder:30b / qwen3:8b / qwen3:1.7b / ...
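The agent loop at the core of this architecture can be sketched in a few lines. This is a simplified illustration, not the real vibe-coder.py code; the LLM call is injected as a plain callable so the sketch runs without Ollama:

```python
def agent_loop(user_input, call_llm, tools, max_iterations=50):
    """Simplified agent loop: ask the LLM, execute requested tools,
    feed results back, repeat until the LLM answers with plain text.
    `call_llm` takes a message list and returns a dict with either
    'content' (final answer) or 'tool_calls' [{'name', 'args'}]."""
    messages = [{"role": "user", "content": user_input}]
    for _ in range(max_iterations):  # hard iteration cap (see Security)
        reply = call_llm(messages)
        if not reply.get("tool_calls"):
            return reply["content"]
        for call in reply["tool_calls"]:
            result = tools[call["name"]](**call["args"])
            messages.append({"role": "tool", "name": call["name"],
                             "content": str(result)})
    return "(stopped: iteration limit reached)"
```

vibe-coder.py additionally handles streaming, permission prompts, and parallel tool execution, but the overall control flow follows this shape.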
Comparison with Similar Projects
There are many excellent open-source projects in the AI coding agent space. Each is built with a different philosophy and use case in mind. vibe-local contributes to this ecosystem by focusing specifically on research and education.
| | aider | opencode | Cline | Codex CLI | Gemini CLI | Goose | vibe-local |
|---|---|---|---|---|---|---|---|
| Language | Python | Go | TypeScript | Rust | TypeScript | Rust + TS | Python (stdlib only) |
| External deps | ~100+ pip pkgs | Go modules | VS Code + npm | Node.js | Node.js | Cargo crates | 0 |
| Local LLM | Yes (many backends) | Yes (config) | Yes (providers) | No | No | Yes | Yes (Ollama native) |
| API key required | Yes (or local) | Yes (or local) | Yes (or local) | Yes (OpenAI) | Yes (Google) | Yes (or local) | No |
| Install | pip install | go install / brew | VS Code marketplace | npm install | npm install | Binary / installer | curl \| bash |
| Interface | Terminal | Terminal (rich TUI) | VS Code | Terminal | Terminal | Terminal + Desktop | Terminal |
| Strength | Git-aware, multi-model | Beautiful TUI, speed | Deep IDE integration | OpenAI ecosystem | Google ecosystem | Extensible, MCP | Simplicity, education, MCP, parallel agents |
| License | Apache 2.0 | MIT | Apache 2.0 | Apache 2.0 | Apache 2.0 | Apache 2.0 | MIT |
aider is one of the most mature CLI tools, with excellent git integration and multi-model support. opencode stands out with its beautiful Bubble Tea TUI and fast Go implementation. Cline provides deep VS Code integration that feels native. Codex CLI and Gemini CLI bring the power of the OpenAI and Google ecosystems respectively. Goose (by Block) offers an extensible MCP-based agent framework. These are all excellent tools built by talented teams; if you're a professional developer, you should try them.
vibe-local takes a different approach: one file, zero external dependencies, Python standard library only. It was built not for professional developers, but for people who want to learn how AI agents work from the inside, use one in an offline classroom, or read the entire source code in an afternoon.
Why vibe-local?
For educators and researchers:
- Zero setup friction: a single `curl | bash` completes setup. No pip install, npm, or venv needed; students start AI coding with one command.
- Single file, readable source: vibe-coder.py is a single file with zero external dependencies, well suited as teaching material for AI agent tool use and prompt engineering.
- Fully offline: works in classrooms without internet, on airplanes, and in remote areas. Models can be pre-downloaded and distributed via USB.
- Pure Python stdlib: no C extensions, no pre-compiled binaries, no virtual environments. Python 3.8+ and Ollama are all you need.
- Research-friendly: the single-file design makes agent behavior, tool-use patterns, and LLM performance easy to experiment on, measure, and modify.
If you're a professional developer looking for the best coding assistant, check out aider, opencode, Cline, or Goose; they are all excellent tools built by talented communities. If you're an educator, researcher, or student who wants to understand how AI coding agents work from the inside, or need something that runs offline with zero dependencies, vibe-local is for you.
CLI Reference
CLI Flags
| Flag | Short | Description |
|---|---|---|
| --prompt | -p | One-shot prompt (non-interactive) |
| --model | -m | Specify Ollama model name |
| --yes | -y | Auto-approve all tool calls |
| --debug | | Enable debug logging |
| --resume | | Resume the last session |
| --session-id <id> | | Resume a specific session |
| --list-sessions | | List saved sessions |
| --ollama-host <url> | | Ollama API endpoint |
| --max-tokens <n> | | Max output tokens (default: 8192) |
| --temperature <f> | | Sampling temperature (default: 0.7) |
| --context-window <n> | | Context window size (default: 32768) |
| --version | | Show version and exit |
Interactive Commands
| Command | Description |
|---|---|
| /help | Show commands |
| /exit, /quit, /q | Exit (auto-saves) |
| /clear | Clear history |
| /model <name> | Switch model |
| /models | List installed models with tiers |
| /status | Session info |
| /save | Save session |
| /compact | Compress history |
| /tokens | Token usage |
| /undo | Undo last write/edit |
| /config | Show config |
| /commit | Git stage + commit |
| /diff | Show git diff |
| /git <cmd> | Run a git subcommand |
| /plan | Plan mode (read-only analysis) |
| /approve, /act | Switch to act mode (execute plan) |
| /checkpoint | Save a git checkpoint |
| /rollback | Roll back to the last checkpoint |
| /autotest | Toggle auto lint+test after edits |
| /watch | Toggle file watcher |
| /skills | List loaded skills |
| /init | Create CLAUDE.md |
| /yes | Enable auto-approve |
| /debug-scroll | Diagnose scroll region |
| exit, quit, bye | Exit (no / needed) |
| """ | Multi-line input |
| ESC | Stop AI response |
| Ctrl+C | Stop (double-tap to exit) |
Configuration
Config File
~/.config/vibe-local/config
Format: KEY="value". Lines starting with # are comments.
| Key | Default | Description |
|---|---|---|
| MODEL | auto (by RAM) | Main model name |
| SIDECAR_MODEL | auto (by RAM) | Sidecar model (lighter; used for compaction etc.) |
| OLLAMA_HOST | http://localhost:11434 | Ollama API endpoint |
| MAX_TOKENS | 8192 | Max output tokens per response |
| TEMPERATURE | 0.7 | Sampling temperature |
| CONTEXT_WINDOW | 32768 | Context window size in tokens |
Example:
# ~/.config/vibe-local/config
MODEL="qwen3:8b"
SIDECAR_MODEL="qwen3:1.7b"
OLLAMA_HOST="http://localhost:11434"
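Parsing this KEY="value" format needs only the standard library. A sketch (the launcher's actual parser may differ in details):

```python
def parse_config(text: str) -> dict:
    """Parse KEY="value" lines; '#' comment lines and blanks are ignored."""
    config = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        config[key.strip()] = value.strip().strip('"')
    return config
```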
Model Tiers
vibe-local auto-detects installed Ollama models and picks the best one for your RAM. Use /models to see tiers.
| Tier | RAM (practical) | Models | Quality | Speed |
|---|---|---|---|---|
| S Frontier | 768GB+ | deepseek-r1:671b, deepseek-v3:671b | Best reasoning | Slow |
| A Expert | 256GB+ | qwen3:235b, llama3.1:405b | Excellent | Moderate |
| B Advanced | 96GB+ | gpt-oss:120b, llama3.3:70b, mixtral:8x22b | Very strong | Good (~70tok/s for gpt-oss) |
| C Solid | 16GB+ | qwen3-coder:30b, qwen2.5-coder:32b | Good balance | Fast |
| D Light | 8GB+ | qwen3:8b, llama3.1:8b | Decent | Very fast |
| E Minimal | 4GB+ | qwen3:1.7b, llama3.2:3b | Basic | Instant |
The RAM column shows the practical minimum for interactive use (model + KV cache + OS). 671B models are not auto-selected even on 512GB machines; set `MODEL=` in the config file to force one.
MCP (Model Context Protocol)
vibe-coder supports MCP servers for extending tool capabilities. Configure in ~/.config/vibe-local/mcp.json or .vibe-local/mcp.json (project-level):
{
"mcpServers": {
"my-server": {
"command": "python3",
"args": ["/path/to/mcp_server.py"],
"env": {"API_KEY": "..."}
}
}
}
MCP tools are auto-discovered at startup and registered as mcp_{server}_{tool}. Compatible with the same format as Claude Code's MCP configuration.
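The mcp_{server}_{tool} naming can be illustrated with a small registry builder. This is a sketch; real discovery happens over the JSON-RPC tools/list call at startup, and `discovered` here stands in for those results:

```python
def register_mcp_tools(mcp_config: dict, discovered: dict) -> dict:
    """Map discovered MCP tools to namespaced registry entries.
    `discovered` maps server name -> list of tool names (as reported
    by each server)."""
    registry = {}
    for server in mcp_config.get("mcpServers", {}):
        for tool in discovered.get(server, []):
            registry[f"mcp_{server}_{tool}"] = (server, tool)
    return registry
```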
Skills
Place .md files in any of these directories to inject custom instructions into the system prompt:
~/.config/vibe-local/skills/ # Global skills
.vibe-local/skills/ # Project-level skills
skills/ # Project-level (alternative)
Use /skills to list loaded skills. Max 50KB per skill file. Symlinks are ignored for security.
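A loader enforcing these rules (.md only, 50KB cap, symlinks skipped) might look like the following sketch; the function name is illustrative:

```python
import os

MAX_SKILL_BYTES = 50 * 1024  # 50KB per-file limit from the docs

def load_skills(directory: str) -> dict:
    """Load .md skill files, skipping symlinks (security) and files
    over the size limit. Returns {filename: content}."""
    skills = {}
    if not os.path.isdir(directory):
        return skills
    for name in sorted(os.listdir(directory)):
        path = os.path.join(directory, name)
        if not name.endswith(".md") or os.path.islink(path):
            continue
        if os.path.getsize(path) > MAX_SKILL_BYTES:
            continue
        with open(path, encoding="utf-8") as f:
            skills[name] = f.read()
    return skills
```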
Environment Variables
Priority: CLI flags > Environment variables > Config file > Defaults
| Variable | Description |
|---|---|
| OLLAMA_HOST | Ollama API endpoint |
| VIBE_CODER_MODEL | Override main model (highest priority) |
| VIBE_LOCAL_MODEL | Main model (set by launcher) |
| VIBE_CODER_SIDECAR | Override sidecar model |
| VIBE_LOCAL_SIDECAR_MODEL | Sidecar model (set by launcher) |
| VIBE_CODER_DEBUG / VIBE_LOCAL_DEBUG | Set to 1 for debug logging |
| VIBE_DEBUG_TUI | Set to 1 to log escape sequences to ~/.vibe-tui-debug.log |
| VIBE_NO_SCROLL | Set to 1 to disable DECSTBM scroll region (fallback mode) |
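The priority chain can be expressed as a small resolver. This is a sketch with illustrative names; in particular, the precise lookup order within the VIBE_CODER_*/VIBE_LOCAL_* pairs is an assumption based on the table above:

```python
def resolve_setting(cli_value, env, config, env_keys, config_key, default):
    """Resolve one setting: CLI flag > environment > config file > default."""
    if cli_value is not None:
        return cli_value
    for key in env_keys:  # e.g. VIBE_CODER_MODEL checked before VIBE_LOCAL_MODEL
        if env.get(key):
            return env[key]
    if config.get(config_key):
        return config[config_key]
    return default
```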
v1.0 Features
Plan/Act Mode
Separate analysis from execution for safer, more deliberate coding:
/plan → Phase 1: Read-only exploration (Glob, Grep, Read only)
/approve → Phase 2: Full execution (all tools re-enabled)
/rollback → Undo all changes since the plan started
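The plan-mode restriction amounts to an allow-list check before each tool call, roughly like this sketch (illustrative names):

```python
READ_ONLY_TOOLS = {"Glob", "Grep", "Read"}

def tool_allowed(tool_name: str, mode: str) -> bool:
    """In plan mode only read-only tools run; act mode re-enables all."""
    if mode == "plan":
        return tool_name in READ_ONLY_TOOLS
    return True
```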
Git Checkpoint & Rollback
Automatic safety net using git stash:
- Auto-checkpoint: created before every Write/Edit and on Plan→Act transition
- /checkpoint → Manual checkpoint
- /rollback → Restore to the last checkpoint
Auto Test Loop
Automatically run lint and tests after file edits:
/autotest → Toggle ON/OFF
- Auto-detects: pytest, npm test
- Python files: syntax check via py_compile
- Test failures are fed back to the LLM for self-repair
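The syntax-check step can be done entirely with the stdlib py_compile module, for example:

```python
import os
import py_compile
import tempfile

def syntax_ok(source: str) -> bool:
    """Return True if `source` compiles as Python (no execution)."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(source)
        path = f.name
    try:
        py_compile.compile(path, doraise=True)  # raises on syntax error
        return True
    except py_compile.PyCompileError:
        return False
    finally:
        os.unlink(path)
```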
MCP Integration
Connect to Model Context Protocol servers:
- JSON-RPC 2.0 over stdio
- Auto-discovery of tools at startup
- Compatible with Claude Code's mcpServers config format
- Project-level config: .vibe-local/mcp.json
See Configuration > MCP for setup.
Skills System
Load custom .md instruction files into the system prompt:
- Global: ~/.config/vibe-local/skills/*.md
- Project: .vibe-local/skills/*.md
- /skills to list loaded skills
v1.1 Features
File Watcher
Automatically detect external file changes and notify the LLM:
/watch → Toggle file watcher ON/OFF
- Poll-based (2s interval), watches common source file extensions (.py, .js, .ts, .html, .css, .json, .go, .rs, etc.)
- Detects: file created, modified, deleted
- Changes are injected as system notes before the next LLM call
- Snapshot refreshes after Write/Edit to avoid false positives
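Poll-based detection reduces to comparing mtime snapshots. A sketch of the diffing step (the real watcher also filters by extension and formats the changes as system notes):

```python
import os

def snapshot(paths):
    """Map each existing path to its modification time."""
    return {p: os.path.getmtime(p) for p in paths if os.path.exists(p)}

def diff_snapshots(old, new):
    """Classify changes between two snapshots."""
    created  = [p for p in new if p not in old]
    deleted  = [p for p in old if p not in new]
    modified = [p for p in new if p in old and new[p] != old[p]]
    return created, modified, deleted
```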
Parallel Agents
Run multiple sub-agents concurrently for faster multi-task execution:
- ParallelAgents tool: accepts 1-4 tasks, runs them in parallel threads
- Each task is an independent sub-agent with its own context
- 5-minute timeout per agent, max 4 concurrent
- LLM automatically chooses ParallelAgents when multiple independent tasks are detected
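With the constraints above (1-4 tasks, 5-minute timeout, max 4 concurrent), the concurrency can be sketched with concurrent.futures; `run_agent` stands in for the sub-agent entry point and the names are illustrative:

```python
from concurrent.futures import ThreadPoolExecutor

MAX_PARALLEL = 4
TIMEOUT_SECONDS = 300  # 5-minute cap per agent

def run_parallel_agents(tasks, run_agent):
    """Run 1-4 independent tasks concurrently; returns results in order."""
    if not 1 <= len(tasks) <= MAX_PARALLEL:
        raise ValueError("ParallelAgents accepts 1-4 tasks")
    with ThreadPoolExecutor(max_workers=MAX_PARALLEL) as pool:
        futures = [pool.submit(run_agent, t) for t in tasks]
        return [f.result(timeout=TIMEOUT_SECONDS) for f in futures]
```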
Streaming Enhancement
Infrastructure for streaming tool call responses from Ollama:
- TUI accumulates tool_call deltas from SSE stream chunks
- _supports_tool_streaming flag for Ollama version detection
- Falls back to sync mode when tool streaming is not supported
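Delta accumulation means concatenating argument fragments per tool-call index until the stream ends. A sketch of the general OpenAI-style pattern (not the exact vibe-coder.py code):

```python
def accumulate_tool_calls(chunks):
    """Merge streamed tool_call deltas (index, optional name, argument
    fragment) into complete calls, concatenating argument strings."""
    calls = {}
    for delta in chunks:
        entry = calls.setdefault(delta["index"], {"name": "", "arguments": ""})
        if delta.get("name"):
            entry["name"] = delta["name"]
        entry["arguments"] += delta.get("arguments", "")
    return [calls[i] for i in sorted(calls)]
```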
v1.3 Features
Fixed Footer (DECSTBM Scroll Region)
Terminal uses VT100 DECSTBM to pin a 3-row footer (separator, status, hints) at the bottom while AI output scrolls above.
┌─────────────────────────────────────┐
│ AI output scrolls here              │ ← Scroll region
│ ...                                 │
├─────────────────────────────────────┤ ← Separator
│ Ready                               │ ← Status line
│ /help │ """ multi-line │ Ctrl+C     │ ← Hint bar
└─────────────────────────────────────┘
- Store-only pattern: update_status() / update_hint() only store text; the footer is drawn atomically during setup() and resize().
- Thread-safe: a non-blocking lock in resize() prevents SIGWINCH deadlock; all state checks happen inside the lock.
- Fallback: VIBE_NO_SCROLL=1 disables the scroll region for incompatible terminals.
- Debug: VIBE_DEBUG_TUI=1 logs all escape sequences to ~/.vibe-tui-debug.log.
- Diagnostic: the /debug-scroll command tests DECSTBM behavior interactively.
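DECSTBM itself is the VT100 `ESC [ top ; bottom r` sequence, and the store-only pattern just separates remembering text from drawing it. A minimal sketch (class and method names mirror the description above but are illustrative):

```python
def set_scroll_region(rows: int, footer_rows: int = 3) -> str:
    """DECSTBM escape sequence reserving `footer_rows` at the bottom."""
    return f"\x1b[1;{rows - footer_rows}r"

class Footer:
    """Store-only footer: setters just remember text; draw() emits it."""
    def __init__(self):
        self.status, self.hint = "", ""
    def update_status(self, text):  # no terminal output here
        self.status = text
    def update_hint(self, text):
        self.hint = text
    def draw(self):
        return f"{self.status}\n{self.hint}"
```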
ESC Interrupt
Press ESC during AI response to stop generation immediately. Faster than Ctrl+C.
Type-Ahead Input
Start typing while the AI is still responding. Input is buffered and ready when the prompt appears.
Security
Use this tool at your own risk. Pay attention to the commands the AI executes.
vibe-local offers normal mode (confirms each action) and auto-approve mode (-y).
Local LLMs are less accurate than cloud AI; they may attempt dangerous operations unintentionally.
Watch for these keywords
| Keyword | Risk |
|---|---|
| sudo | Admin privileges; affects the entire system |
| chmod / chown | Changes file permissions |
| dd / mkfs / /dev/ | Direct disk operations |
| > overwriting configs | Settings may be erased |
| --force | Skips safety checks |
| Long commands you don't understand | If you can't read it, don't allow it |
Safe usage rules
- Choose n (normal mode) on first launch and approve each action
- Never allow commands you don't understand
- Practice in a new, empty folder
- Reject sudo requests
- Ctrl+C stops execution at any time
Built-in Security Mechanisms
| Mechanism | Description |
|---|---|
| SAFE_TOOLS vs ASK_TOOLS | Read/Glob/Grep/SubAgent/TaskTools are auto-approved. Bash/Write/Edit require confirmation. WebFetch/WebSearch need extra context. |
| SSRF prevention | OLLAMA_HOST restricted to localhost only |
| URL scheme validation | Only http:// and https:// allowed |
| Session ID sanitization | Path traversal prevention |
| Max iteration limit | Agent loop stops after 50 iterations |
| Symlink protection | Refuses to read/write through symlinks |
| Protected path blocking | Blocks writes to config/permission files |
| Dangerous command detection | Blocks piped download-and-execute patterns such as `curl \| bash` |
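A dangerous-command detector of this kind is typically a set of regex checks. The patterns below are illustrative only; the real detector in vibe-coder.py covers different and more thorough cases:

```python
import re

# Illustrative patterns only; not the actual vibe-coder.py rule set.
DANGEROUS_PATTERNS = [
    r"\bsudo\b",
    r"\brm\s+-rf\s+/",
    r"\b(curl|wget)\b[^|]*\|\s*(ba)?sh",  # piped download-and-execute
    r"\bmkfs\b",
    r"\bdd\b\s+.*of=/dev/",
]

def looks_dangerous(command: str) -> bool:
    """Return True if the command matches any risky pattern."""
    return any(re.search(p, command) for p in DANGEROUS_PATTERNS)
```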
Workshop Guide
For instructors
# 1. Pre-install on venue computers (while online)
curl -fsSL https://raw.githubusercontent.com/ochyai/vibe-local/main/install.sh | bash
# 2. Pre-download models (for offline use)
ollama pull qwen3:8b # For 16GB machines
ollama pull qwen3-coder:30b # For 32GB machines (recommended)
# 3. Verify
vibe-local -p "Write Hello World in Python"
Starter exercises
1. "Create a rock-paper-scissors game in Python" → Basic programming
2. "List all files in this folder" → Terminal operations
3. "Create a timer app in HTML and open it" → Web development
4. "Create minesweeper in HTML" → Game development
5. "Check the current system information" → OS operations
Offline Capabilities
| Feature | Offline | Notes |
|---|---|---|
| Code generation & execution | Yes | All processed locally |
| File operations | Yes | |
| Terminal commands | Yes | |
| Git (local) | Yes | push/pull need network |
| HTML app creation | Yes | Opens in browser |
| Plan/Act mode | Yes | |
| Git checkpoint & rollback | Yes | |
| Auto test loop | Yes | |
| MCP servers (local) | Yes | Depends on MCP server |
| Skills system | Yes | |
| File watcher | Yes | |
| Parallel agents | Yes | |
| Fixed footer (DECSTBM) | Yes | VIBE_NO_SCROLL=1 to disable |
| Web search | Online only | DuckDuckGo |
| URL fetch | Online only | |
| Package install | Online only | pip/brew/winget |
Legal
What this tool does:
- Runs vibe-coder.py, a fully open-source Python coding agent
- Communicates directly with Ollama (an open-source LLM runtime) running locally
- Optionally connects to MCP servers (local processes, user-configured)
- No communication with external servers (web search/fetch are optional)
- Does not use any Anthropic software
Licenses:
- vibe-coder.py: MIT License
- Ollama: MIT License
- Qwen3 models: Apache 2.0 License
- vibe-local: MIT License
All components are open-source. This tool is intended for research and education.
Disclaimer
This project is NOT affiliated with, endorsed by, or associated with Anthropic. "Claude" is a trademark of Anthropic, PBC. This is an unofficial community tool.
Since v0.3.0, this tool does not use any proprietary software. All components (vibe-coder.py, Ollama, Qwen3 models) are open-source licensed.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND. The authors are not liable for any damages arising from the use of this software. Use entirely at your own risk.
License
MIT
