io.github.amer-prog/gamma-mcp-server
Create presentations, documents, and webpages from any MCP-compatible AI assistant via Gamma.app
Gamma MCP Server
A Model Context Protocol (MCP) server that integrates Gamma.app with AI assistants. Create presentations, documents, webpages, and social posts directly from your AI conversations.
Works with: Claude Code, Claude Desktop, OpenCode, GitHub Copilot CLI, Google Gemini CLI, and any other MCP-compatible AI assistant.
Features
- Generate Content: Create professional presentations, documents, webpages, and social posts from text prompts
- Theme Support: Browse and apply visual themes to your content
- Folder Organization: Save generated content to specific folders
- Template Remix: Create variations of existing Gamma templates
- Email Sharing: Share generated content directly via email
Quick Start
1. Clone and Install
git clone https://github.com/Arkava-AI/gamma-mcp-server.git
cd gamma-mcp-server
npm install
npm run build
2. Get Your Gamma API Key
- Log in to gamma.app
- Go to Settings > API (or Settings > Members > API tab)
- Click Create API key
- Copy the key (format: `sk-gamma-xxxxxxxx`)
Note: Requires a Gamma Pro, Ultra, Team, or Business account.
3. Configure Your AI Assistant
Choose your AI assistant below for setup instructions.
Claude Desktop (macOS / Windows / Linux)
Config File
| OS | Path |
|---|---|
| macOS | ~/Library/Application Support/Claude/claude_desktop_config.json |
| Linux | ~/.config/Claude/claude_desktop_config.json |
| Windows | %APPDATA%\Claude\claude_desktop_config.json |
Add to the mcpServers object:
{
"mcpServers": {
"gamma": {
"command": "node",
"args": ["/absolute/path/to/gamma-mcp-server/dist/index.js"],
"env": {
"GAMMA_API_KEY": "sk-gamma-your-api-key-here"
}
}
}
}
Restart Claude Desktop
Restart Claude Desktop to load the new MCP server. You should see "gamma" in your MCP servers list.
Claude Code
Claude Code uses the same MCP configuration as Claude Desktop. If you've already configured Claude Desktop, you're all set.
For project-level configuration, create a .claude/settings.json file in your project directory with the same mcpServers structure shown above. This allows different projects to use different MCP server configurations.
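For example, a minimal project-level `.claude/settings.json` might look like the following (the path and API key are placeholders you replace with your own):

```json
{
  "mcpServers": {
    "gamma": {
      "command": "node",
      "args": ["/absolute/path/to/gamma-mcp-server/dist/index.js"],
      "env": {
        "GAMMA_API_KEY": "sk-gamma-your-api-key-here"
      }
    }
  }
}
```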
OpenCode
Config File
| Scope | Path |
|---|---|
| Global (user) | ~/.config/opencode/opencode.json |
| Project | ./opencode.json (in your project root) |
JSON Structure
{
"mcp": {
"gamma": {
"type": "local",
"command": ["node", "/absolute/path/to/gamma-mcp-server/dist/index.js"],
"enabled": true,
"environment": {
"GAMMA_API_KEY": "sk-gamma-your-api-key-here"
}
}
}
}
Note: OpenCode uses a different config format from Claude Desktop: the top-level key is `mcp` (not `mcpServers`), a `type` field is required (`"local"` or `"remote"`), `command` is an array, and environment variables go under `environment`.
Restart OpenCode after editing the config to load the server.
GitHub Copilot CLI
Config File
~/.copilot/mcp-config.json
JSON Structure
{
"mcpServers": {
"gamma": {
"type": "local",
"command": "node",
"args": ["/absolute/path/to/gamma-mcp-server/dist/index.js"],
"env": {
"GAMMA_API_KEY": "sk-gamma-your-api-key-here"
},
"tools": ["*"]
}
}
}
Note: Requires the GitHub Copilot CLI (`gh copilot`), which is not the same as OpenAI Codex.
OpenAI Codex
Config File
~/.codex/config.toml (TOML format, not JSON)
TOML Structure
[mcp_servers.gamma]
command = "node"
args = ["/absolute/path/to/gamma-mcp-server/dist/index.js"]
enabled = true
[mcp_servers.gamma.env]
GAMMA_API_KEY = "sk-gamma-your-api-key-here"
Note: Codex uses TOML format, not JSON. The `env` section is a separate table under `[mcp_servers.gamma.env]`.
Google Gemini CLI
Config File
| Scope | Path |
|---|---|
| User | ~/.gemini/settings.json |
| Project | .gemini/settings.json (in your project root) |
JSON Structure
{
"mcpServers": {
"gamma": {
"command": "node",
"args": ["/absolute/path/to/gamma-mcp-server/dist/index.js"],
"cwd": "/absolute/path/to/gamma-mcp-server",
"env": {
"GAMMA_API_KEY": "sk-gamma-your-api-key-here"
},
"timeout": 30000
}
}
}
Restart Gemini CLI after editing the config to load the server.
Available Tools
| Tool | Description |
|---|---|
| `gamma_generate` | Create presentations, documents, webpages, or social posts |
| `gamma_get_status` | Check generation progress (with optional polling) |
| `gamma_from_template` | Remix existing Gamma templates |
| `gamma_list_themes` | Browse available visual themes |
| `gamma_list_folders` | List your Gamma folders |
| `gamma_share_email` | Share content via email |
| `gamma_health` | Verify server and API are reachable |
| `gamma_archive` | Archive a Gamma from your workspace |
gamma_generate
Create new content using Gamma's AI.
Formats & Sizes:
- `presentation`: fluid, 16x9, 4x3
- `document`: fluid, pageless, letter, a4
- `social`: 1x1, 4x5, 9x16
- `webpage`: fluid
Example prompts in your AI assistant:
- "Create a 5-slide presentation about sustainable energy"
- "Generate a document explaining our Q1 results"
- "Make a social media post announcing our new product"
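As a sketch, a `gamma_generate` call made by your assistant might pass arguments like the following. The `prompt`, `format`, and `size` field names are illustrative; check the tool's actual input schema for the exact names it accepts:

```json
{
  "prompt": "A 5-slide presentation about sustainable energy",
  "format": "presentation",
  "size": "16x9"
}
```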
gamma_get_status
Check whether a generation has completed. Set `waitForCompletion: true` to poll automatically until it finishes.
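For instance, a polling call could look like this. The `generationId` field name is illustrative (the tool presumably takes the ID returned by `gamma_generate`); `waitForCompletion` is documented above:

```json
{
  "generationId": "gen_abc123",
  "waitForCompletion": true
}
```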
gamma_from_template
Remix an existing Gamma with new content or variable substitutions.
{
"templateId": "gamma_xyz789",
"prompt": "Update for Q1 2025",
"variables": { "company_name": "Acme Corp" }
}
Multi-Machine Setup
This repository is designed for easy deployment across multiple machines:
# On each machine:
git clone https://github.com/Arkava-AI/gamma-mcp-server.git
cd gamma-mcp-server
npm install && npm run build
# Then configure your AI assistant with the local path
To update on any machine:
git pull
npm install
npm run build
# Restart your AI assistant
Environment Variables
| Variable | Default | Description |
|---|---|---|
| `GAMMA_API_KEY` | (required) | Your Gamma API key (`sk-gamma-...`) |
| `GAMMA_API_BASE_URL` | `https://public-api.gamma.app/v1.0` | Override for self-hosted Gamma instances |
| `GAMMA_POLL_INTERVAL_MS` | `2000` | Milliseconds between status polls (default 2 s) |
| `GAMMA_MAX_POLL_ATTEMPTS` | `150` | Max polling attempts before timeout (default 150 × 2 s = 5 min) |
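To change the defaults, add these variables alongside your API key in the `env` block of your assistant's MCP config. For example, this fragment (values chosen for illustration) polls every 5 s for up to 60 attempts, keeping the same 5-minute overall timeout:

```json
{
  "env": {
    "GAMMA_API_KEY": "sk-gamma-your-api-key-here",
    "GAMMA_POLL_INTERVAL_MS": "5000",
    "GAMMA_MAX_POLL_ATTEMPTS": "60"
  }
}
```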
Development
# Run in development mode with auto-reload
npm run dev
# Build for production
npm run build
# Run linting
npm run lint
# Format code
npm run format
# Type check
npm run typecheck
# Test with MCP Inspector
npm run inspect
Project Structure
gamma-mcp-server/
├── src/
│   ├── index.ts       # Main entry point
│   ├── constants.ts   # Configuration constants
│   ├── types.ts       # TypeScript interfaces
│   ├── schemas/       # Zod validation schemas
│   ├── services/      # API client and formatters
│   └── tools/         # MCP tool implementations
├── dist/              # Compiled JavaScript (generated)
├── package.json
├── tsconfig.json
└── eslint.config.js
API Credits
Gamma uses a credit-based system for API usage. Credits are consumed per generation. Monitor your usage in the Gamma dashboard and enable auto-recharge if needed.
Troubleshooting
| Error | Solution |
|---|---|
| "GAMMA_API_KEY environment variable is required" | Ensure `env.GAMMA_API_KEY` is set in your AI assistant's MCP config |
| "Invalid API key" | Keys should start with `sk-gamma-`. Verify the complete key. |
| "Rate limit exceeded" | Wait a few minutes. Contact Gamma support for higher limits. |
| "Insufficient credits" | Top up credits or enable auto-recharge in Gamma settings. |
Requirements
- Node.js 18+
- Gamma Pro/Ultra/Team/Business account (for API access)
- MCP-compatible AI assistant (Claude Code, Claude Desktop, OpenCode, GitHub Copilot CLI, Gemini CLI, etc.)
Maintainer
Arkava Ltd (engage@arkava.ai)
License
MIT License - see LICENSE for details.
