# GLM Coding Plan Statusline

Smart status bar for the GLM Coding Plan: real-time usage monitoring for GLM Coding Plan users.
## Features

- **Real-time Quota Monitoring**: displays MCP monthly quota usage as a percentage
- **Token Usage Tracking**: monthly, daily, and session-level token consumption statistics
- **Context Progress Bar**: visualizes context window usage
- **Smart Color Alerts**: bars change color automatically as usage rises
- **Smart Caching**: fewer API requests, faster responses
- **Flexible Configuration**: supports multiple display modes
- **GSD Bridge Compatible**: works with GSD's context-monitor for low-context warnings
## Requirements

- **Node.js**: version ≥ 16.0.0
- **Claude Code**: used with the GLM Coding Plan
- **GLM Coding Plan**: a valid `ANTHROPIC_AUTH_TOKEN` is required
## Quick Start

Add to `~/.claude/settings.json`:

```json
{
  "statusLine": {
    "type": "command",
    "command": "npx @wangjs-jacky/glm-coding-plan-statusline@latest"
  }
}
```
Save and restart Claude Code to see the status bar!
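For background on what `"type": "command"` means here: Claude Code runs the configured command on each statusline refresh, passes session information as JSON on stdin, and displays whatever the command prints to stdout. The sketch below illustrates that contract only; the input field names (`model.display_name`, `session_id`) are assumptions that may vary by Claude Code version, and `formatLine` is an illustrative name, not part of this package.

```javascript
// Illustrative statusline sketch: parse Claude Code's JSON payload from
// stdin and print one status line to stdout. Field names are assumptions
// based on Claude Code's statusline input and may vary by version.
function formatLine(input) {
  const model = input.model?.display_name ?? 'unknown';
  return `${model} │ session ${input.session_id ?? '?'}`;
}

// Only wire up stdin when not attached to a terminal (i.e. when Claude
// Code is piping the JSON payload in).
if (!process.stdin.isTTY) {
  let raw = '';
  process.stdin.on('data', (chunk) => (raw += chunk));
  process.stdin.on('end', () => {
    if (raw.trim()) console.log(formatLine(JSON.parse(raw)));
  });
}
```

This package does the same kind of parsing internally before adding quota lookups and progress bars on top.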
## Display Example

```text
GLM-5 │ Sess:160.0K │ Day:42.8M │ Mon:979.2M
5H ██░░░░░░ 22% │ MCP ███░░░░░ 28% │ Context █████░░░ 68% (200K)
```
## Fields

### Line 1: Token Statistics
| Field | Description | Color |
|---|---|---|
| GLM-5 | Current model | Cyan bold |
| Sess:160.0K | Session tokens | Gray |
| Day:42.8M | Daily tokens | Default |
| Mon:979.2M | Monthly tokens | Blue |
### Line 2: Quota Progress Bars
| Field | Description | Color Rules |
|---|---|---|
| 5H | 5-hour quota used | Green(<50%) / Yellow(50-80%) / Red(>80%) |
| MCP | Monthly quota used | Green(<50%) / Yellow(50-80%) / Red(>80%) |
| Context | Context window usage | Green(<50%) / Yellow(50-80%) / Red(>80%) |
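The color rules above can be sketched as a small rendering helper. This is an illustrative sketch, not the package's actual code: `quotaColor` and `renderBar` are hypothetical names, and the rounding of filled bar segments is an assumption.

```javascript
// Illustrative sketch of the color rules in the table above
// (not this package's actual implementation).
// ANSI escape codes for the three alert colors.
const GREEN = '\x1b[32m';
const YELLOW = '\x1b[33m';
const RED = '\x1b[31m';
const RESET = '\x1b[0m';

// Map a usage percentage to a color: <50 green, 50-80 yellow, >80 red.
function quotaColor(pct) {
  if (pct < 50) return GREEN;
  if (pct <= 80) return YELLOW;
  return RED;
}

// Render an 8-segment bar like "MCP ████░░░░ 50%".
// The segment-rounding rule is an assumption for this sketch.
function renderBar(label, pct, width = 8) {
  const filled = Math.round((pct / 100) * width);
  const bar = '█'.repeat(filled) + '░'.repeat(width - filled);
  return `${label} ${quotaColor(pct)}${bar} ${pct}%${RESET}`;
}
```

For example, `renderBar('5H', 22)` yields a mostly empty green bar, while any value above 80% switches the bar to red.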
## GSD Bridge Compatibility

This statusline is compatible with the Get Shit Done (GSD) framework's context-monitoring feature.

### How It Works

Each time Claude Code invokes the statusline, it writes the current context metrics to a bridge file:

```text
/tmp/claude-ctx-{session_id}.json
```

GSD's `gsd-context-monitor` hook can read this file and inject low-context warnings into the agent's context.
### Bridge File Format

```json
{
  "session_id": "abc123",
  "remaining_percentage": 65,
  "used_pct": 35,
  "timestamp": 1742053200
}
```
### Using with GSD

If GSD is installed, its context-monitor hook automatically reads these metrics and warns the agent when context runs low (≤35% warning, ≤25% critical).
## Options

```bash
# Full mode (two lines, recommended)
npx @wangjs-jacky/glm-coding-plan-statusline

# Compact mode (single line)
npx @wangjs-jacky/glm-coding-plan-statusline --compact

# Local mode (no API requests, context only)
npx @wangjs-jacky/glm-coding-plan-statusline --local

# Clear cache
npx @wangjs-jacky/glm-coding-plan-statusline --clear-cache

# Show help
npx @wangjs-jacky/glm-coding-plan-statusline --help
```
## Environment Variables

Make sure these environment variables are set (usually in the `env` field of `settings.json`):

```json
{
  "env": {
    "ANTHROPIC_AUTH_TOKEN": "your-token-here",
    "ANTHROPIC_BASE_URL": "https://open.bigmodel.cn/api/anthropic"
  }
}
```
## License

MIT License - see the LICENSE file for details.
## Contributing

Issues and pull requests are welcome!
## Contact

- Author: wangjs-jacky
- GitHub: https://github.com/wangjs-jacky/glm-coding-plan-statusline
- Issues: https://github.com/wangjs-jacky/glm-coding-plan-statusline/issues

If this project helps you, please give it a ⭐️ Star!
