gemini-cli-rs
English | 日本語
Gemini CLI written in Rust, optimized for use as an MCP tool in Claude Code.
Features
- MCP Server mode – Runs as a native MCP server (`--mcp-server`); integrates directly with Claude Code
- SSE Streaming – Streams responses from the Gemini API in real time
- Google Search Grounding – Always on; sources are printed to stderr in CLI mode and included in the response in MCP mode
- GEMINI.md Context – Loads project-specific context from `GEMINI.md` (walks up to the nearest `.git` root)
- Single-shot mode – Designed for non-interactive, scriptable use
Prerequisites
- A Gemini API key (`GEMINI_API_KEY`) – get one at Google AI Studio
Installation
Download pre-built binary
Pre-built binaries are available for macOS and Linux on the Releases page.
# Apple Silicon macOS
curl -L https://github.com/heki1224/gemini-cli-rs/releases/latest/download/gemini-cli-rs-aarch64-apple-darwin.tar.xz | tar -xJ
mv gemini ~/.local/bin/gemini
# Intel macOS
curl -L https://github.com/heki1224/gemini-cli-rs/releases/latest/download/gemini-cli-rs-x86_64-apple-darwin.tar.xz | tar -xJ
mv gemini ~/.local/bin/gemini
# Linux x86_64
curl -L https://github.com/heki1224/gemini-cli-rs/releases/latest/download/gemini-cli-rs-x86_64-unknown-linux-gnu.tar.xz | tar -xJ
mv gemini ~/.local/bin/gemini
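The commands above move the binary into `~/.local/bin`; the shell only finds `gemini` if that directory is on your `PATH`. A quick check (the profile file to edit depends on your shell):

```shell
# Ensure ~/.local/bin exists and check whether it is already on PATH.
mkdir -p "$HOME/.local/bin"
case ":$PATH:" in
  *":$HOME/.local/bin:"*) echo "~/.local/bin is on PATH" ;;
  *) echo 'add: export PATH="$HOME/.local/bin:$PATH" to your shell profile' ;;
esac
```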
Build from source
git clone https://github.com/heki1224/gemini-cli-rs
cd gemini-cli-rs
cargo build --release
cp target/release/gemini ~/.local/bin/gemini
Usage
CLI mode
# Set your API key
export GEMINI_API_KEY="your-api-key"
# Send a prompt
gemini -p "What is the capital of France?"
# Use a different model
gemini -m gemini-2.5-pro -p "Explain Rust's borrow checker"
Grounding sources are printed to stderr after the response.
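Because the answer goes to stdout and the grounding sources to stderr, scripts can capture the two streams separately. A self-contained sketch, using a stub function in place of the real binary (which requires an API key):

```shell
# Stub standing in for `gemini -p ...`: answer on stdout, sources on stderr,
# mirroring the CLI-mode behavior described above.
gemini_stub() { echo "Paris"; echo "[source] https://example.com" >&2; }

answer=$(gemini_stub 2>sources.txt)   # stdout -> variable, stderr -> file
echo "answer: $answer"
cat sources.txt
```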
MCP server mode
In MCP mode the binary is launched automatically by Claude Code – you do not normally run it directly. See MCP Setup for registration steps.
# Manual launch (for debugging)
GEMINI_API_KEY="your-api-key" gemini --mcp-server
Grounding sources are included in the response text (not stderr).
Options
| Flag | Description | Default |
|---|---|---|
| -p, --prompt | Prompt to send (required in CLI mode) | – |
| -m, --model | Model to use | gemini-3-flash-preview |
| --mcp-server | Run as MCP server (JSON-RPC 2.0 over stdio) | – |
The API key is read from the GEMINI_API_KEY environment variable only (no --api-key flag).
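Since there is no `--api-key` flag, scripts that wrap the CLI should fail fast when the variable is unset. POSIX parameter expansion does this in one line (the key value below is a placeholder so the sketch runs standalone):

```shell
# Placeholder key so the sketch is self-contained; in practice you export a real key.
export GEMINI_API_KEY="your-api-key"

# Abort with a clear message if GEMINI_API_KEY is unset or empty.
: "${GEMINI_API_KEY:?GEMINI_API_KEY is not set}"
echo "key is set"
```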
Environment variables
| Variable | Description | Default |
|---|---|---|
| GEMINI_API_KEY | Gemini API key (required) | – |
| GEMINI_DEFAULT_MODEL | Override the default model at runtime | gemini-3-flash-preview |
| GEMINI_HIGH_PERF_MODEL | Override the high-performance model at runtime | gemini-3.1-pro-preview |
MCP Setup (Claude Code)
This tool is designed to be used as an MCP server inside Claude Code, making Gemini – with Google Search Grounding – available as a secondary AI assistant.
1. Build the binary
cargo build --release
2. Register as an MCP server
claude mcp add gemini /path/to/target/release/gemini --scope user -- --mcp-server
Use --scope user to make it available across all projects, or --scope project to limit it to the current project.
3. Set your API key
Add GEMINI_API_KEY to your shell profile (~/.zshrc, ~/.bashrc, etc.):
export GEMINI_API_KEY="your-api-key"
4. Restart Claude Code and verify
claude mcp list
Once registered, the ask_gemini_mcp tool is available. Claude Code routes prompts to Gemini with real-time Google Search grounding.
MCP tool parameters
| Parameter | Type | Description |
|---|---|---|
| prompt | string (required) | The prompt to send to Gemini. The aliases request, query, message, text, and input are also accepted as fallbacks |
| model | string | Override the model for this request (ignored when thinking=true) |
| thinking | boolean | Use the high-performance model (gemini-3.1-pro-preview) for complex reasoning or deep analysis. Default: false |
Note: The binary writes protocol messages to stdout and all logs to stderr. Do not redirect stderr if you want to see debug output.
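Claude Code handles the wire protocol for you, but when debugging a manual launch it helps to see the shape of a request. The payload below is an assumption based on the standard MCP JSON-RPC 2.0 `tools/call` framing, not captured from the binary; the argument names match the parameter table above:

```shell
# Hypothetical JSON-RPC 2.0 tools/call message for the ask_gemini_mcp tool.
req='{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"ask_gemini_mcp","arguments":{"prompt":"Summarize Rust ownership","thinking":false}}}'
printf '%s\n' "$req"
# For a manual debug session you would pipe this into `gemini --mcp-server`
# with GEMINI_API_KEY exported (stdout carries protocol, stderr carries logs).
```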
GEMINI.md
Place a GEMINI.md file in your current working directory (or any parent directory) to inject system-level context into every request. The search walks up to the nearest .git directory; if no .git is found, it continues to the filesystem root. Useful for project-specific instructions.
Note: Files larger than 1 MB are silently ignored.
your-project/
├── .git/
├── GEMINI.md   ← loaded automatically
└── src/
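The lookup described above can be sketched as a small shell function – an illustration of the documented behavior, not the actual Rust implementation:

```shell
# Walk from a starting directory toward /, returning the first GEMINI.md found.
# Stops after checking the directory that contains .git (the repo root).
find_gemini_md() {
  dir=$1
  while :; do
    if [ -f "$dir/GEMINI.md" ]; then printf '%s\n' "$dir/GEMINI.md"; return 0; fi
    if [ -d "$dir/.git" ] || [ "$dir" = "/" ]; then return 1; fi
    dir=$(dirname "$dir")
  done
}

# Demo on a throwaway tree matching the diagram above.
tmp=$(mktemp -d)
mkdir -p "$tmp/proj/.git" "$tmp/proj/src"
touch "$tmp/proj/GEMINI.md"
find_gemini_md "$tmp/proj/src"
```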
License
MIT – see LICENSE
