nac
A small coding agent, heavily inspired by slate, with additional inspiration from nanocode and pi.
Install the latest edge build:
curl -fsSL https://raw.githubusercontent.com/sapiosaturn/nac/main/scripts/install.sh | sh
Pinned version installs are not supported yet.
Set OPENAI_API_KEY, then run nac. Use nac --compact for the compact single-column TUI, or nac --full to override a compact config default.
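To make the compact layout the default instead of passing the flag each launch, the same choice can be set in the config file (the `[ui]` table shown in the recommended config below); `nac --full` then overrides it per run:

```toml
# ~/.config/nac/config.toml
[ui]
mode = "compact"
```
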
To use ChatGPT Codex auth instead of an OpenAI API key, run nac codex-auth login and complete the device-code flow in a browser. Launch with nac --backend chatgpt-codex-responses, or configure backend = "chatgpt-codex-responses" under [model].
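As noted above, the backend can also be pinned in the config file so the `--backend` flag is unnecessary:

```toml
# ~/.config/nac/config.toml
[model]
backend = "chatgpt-codex-responses"
```
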
Optional:
OPENAI_BASE_URL
OPENAI_MODEL
Linux installs use the portable static build.
Upgrade to the latest edge build:
nac upgrade
AGENTS.md is loaded hierarchically from the project and globally from NAC_HOME / ~/.config/nac.
Skills are discovered from project and user skill directories and activated from workers with activate_skill(...).
Sessions are stored in the project store (.nac/store.db by default): use nac resume for the picker, nac resume --last for the newest session, or nac resume SESSION_ID for a specific session. Thread history does not auto-compact right now.
Uninstall:
curl -fsSL https://raw.githubusercontent.com/sapiosaturn/nac/main/scripts/uninstall.sh | sh
nac can run tools inside a Podman sandbox (requires Podman to be installed):
nac --sandbox
By default this mounts the current directory into the sandbox at /workspace.
For a custom setup:
--no-mount-cwd disables the default current-directory mount
--mount HOST:GUEST adds a read-write mount
--mount-ro HOST:GUEST adds a read-only mount
--sandbox-image IMAGE overrides the default image (python:3.13-bookworm)
On macOS, start Podman first:
podman machine init
podman machine start
Recommended config
Optional config lives at ~/.config/nac/config.toml, or at $NAC_HOME/config.toml when NAC_HOME is set. Explicit CLI args and environment variables override TOML defaults. Resumed sessions continue using the model and sandbox settings stored in their session snapshot.
The api_key_env setting names the environment variable to read when OPENAI_API_KEY is not set. Store paths remain relative to the launch working directory.
[agents_md]
fallback_filenames = []
max_bytes = 4194304
[ui]
mode = "full"
[storage]
store_path = ".nac/store.db"
[model]
backend = "openai-responses"
model = "gpt-5.5"
base_url = "https://api.openai.com/v1"
reasoning_effort = "xhigh"
api_key_env = "OPENAI_API_KEY"
[sandbox]
image = "python:3.13-bookworm"
[worker]
thread_timeout_secs = 3600
[mcp_servers.exa_web_search]
enabled = true
transport = "streamable_http"
url = "https://mcp.exa.ai/mcp"
[mcp_servers.context7]
enabled = true
transport = "streamable_http"
url = "https://mcp.context7.com/mcp"
[mcp_servers.grep_app]
enabled = true
transport = "streamable_http"
url = "https://mcp.grep.app"
Supported MCP transports right now are stdio and streamable_http. Stdio servers can provide command, args, and env; streamable HTTP servers provide url and optional headers. MCP string values support ${ENV_VAR} expansion.
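As a sketch of both shapes (the server names, command, and EXAMPLE_TOKEN variable here are illustrative, not real servers shipped with nac):

```toml
# Hypothetical stdio server: nac launches the command and talks MCP over stdin/stdout.
[mcp_servers.example_stdio]
enabled = true
transport = "stdio"
command = "my-mcp-server"                      # hypothetical binary on PATH
args = ["--stdio"]
env = { EXAMPLE_TOKEN = "${EXAMPLE_TOKEN}" }   # ${ENV_VAR} values are expanded

# Hypothetical streamable HTTP server with an auth header.
[mcp_servers.example_http]
enabled = true
transport = "streamable_http"
url = "https://mcp.example.com/mcp"
headers = { Authorization = "Bearer ${EXAMPLE_TOKEN}" }
```
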
For ChatGPT Codex auth, the default base URL is https://chatgpt.com/backend-api; NAC sends non-streaming Responses requests to /codex/responses. Use nac codex-auth status to inspect the saved account and nac codex-auth logout to remove local tokens.
