# WrenAI — Open Context Layer for AI Agents
> 📣 **2026-05-07** — Wren Engine has merged into this repo under `core/`. The previous `Canner/wren-engine` repo is archived, and the previous WrenAI GenBI app is preserved on the `legacy/v1` branch (tag `v1-final`). Read the announcement →
## Why WrenAI?
AI agents fail on business data not because they can't write SQL — they fail because they don't know what your warehouse means. Overlapping tables, inconsistent naming, metric definitions scattered across dashboards and SQL files: an LLM with raw database access guesses just as badly as a new hire on day one.
WrenAI is the open context layer that fills that gap. You model your business in MDL (Modeling Definition Language) — entities, relationships, calculations, governed access patterns — and any agent (Claude, Cursor, ChatGPT, internal copilots, customer-facing apps) queries through the same layer your analysts already use.
A Rust engine powered by Apache DataFusion translates the modeled SQL and runs it against 20+ data sources (PostgreSQL, BigQuery, Snowflake, Spark, etc.). Use it as a Python SDK, a CLI, a WASM module in the browser, or as building blocks for agent skills.
## Quick start
The fastest path is to let an AI coding agent (Claude Code, Cursor, Aider, etc.) drive the install:
```shell
# Install WrenAI skills into your AI agent
npx skills add Canner/WrenAI --skill '*'
```
Start a new agent session and ask:
> Use the `wren-onboarding` skill to install and set up Wren AI Core.
The `wren-onboarding` skill walks the agent through environment checks, package install, project scaffolding, the first data source connection, and a first query.
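Those environment checks can be approximated by hand. A minimal sketch, assuming only a Python version floor (3.9 is an assumption, not an official requirement), the `npx` binary, and an importable module named `wren` from the PyPI `wren-engine` package; the real skill's checks may differ:

```python
import importlib.util
import shutil
import sys

def check_environment() -> list:
    """Hand-rolled sketch of onboarding-style environment checks."""
    problems = []
    # Assumed minimum Python version, for illustration only.
    if sys.version_info < (3, 9):
        problems.append("Python 3.9+ required")
    # npx is needed to run `npx skills add`.
    if shutil.which("npx") is None:
        problems.append("npx not found on PATH")
    # The Python SDK is published on PyPI as `wren-engine`; the import
    # name `wren` is an assumption here.
    if importlib.util.find_spec("wren") is None:
        problems.append("SDK not installed: pip install wren-engine")
    return problems

for problem in check_environment():
    print("MISSING:", problem)
```

Running this before the agent session makes failures explicit instead of surfacing mid-install.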
Full CLI guide and manual install steps: `core/wren/README.md`. Installable extras for each connector are listed there.
## Supported Data Sources
Wren Engine is built to work across modern data stacks, including warehouses, databases, and file-based sources.
Current open source support includes connectors such as:
- Amazon S3
- Apache Spark
- Apache Doris
- Athena
- BigQuery
- ClickHouse
- Databricks
- DuckDB
- Google Cloud Storage
- Local files
- MinIO
- MySQL
- Oracle
- PostgreSQL
- Redshift
- SQL Server
- Snowflake
- Trino
See the connector API docs in the project documentation for the latest connection schemas and capabilities.
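Connection details vary by source, but most of the sources above take one of two shapes: a URL-style DSN (PostgreSQL, MySQL, Redshift, etc.) or a file path (DuckDB, local files). The helper below is illustrative only; it uses the standard PostgreSQL URL format, not Wren's actual connector schema:

```python
from urllib.parse import quote, urlparse

def postgres_dsn(user: str, password: str, host: str, db: str,
                 port: int = 5432) -> str:
    """Build a standard PostgreSQL URL-style DSN.

    Illustrative helper, not part of Wren's API. Credentials are
    percent-encoded so special characters survive the URL.
    """
    return (f"postgresql://{quote(user)}:{quote(password, safe='')}"
            f"@{host}:{port}/{db}")

dsn = postgres_dsn("analyst", "p@ss/word", "db.internal", "sales")
parsed = urlparse(dsn)
print(parsed.hostname, parsed.port)  # db.internal 5432
```

Percent-encoding matters in practice: a password containing `@` or `/` breaks a naively concatenated DSN.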
## Repository map

| Path | What's there |
|---|---|
| `core/` | Rust engine + Python/WASM bindings + CLI. The context layer's core machinery. |
| `core/wren-core/` | Rust semantic engine (Cargo workspace). |
| `core/wren-core-base/` | Manifest types (Model, Column, Cube, Relationship, View). |
| `core/wren-core-py/` | PyO3 bindings (PyPI: `wren-core`). |
| `core/wren-core-wasm/` | WebAssembly build for in-browser semantic SQL (npm: `wren-core-wasm`). |
| `core/wren/` | Python SDK + `wren` CLI (PyPI: `wren-engine`). |
| `core/wren-mdl/` | MDL JSON schema. |
| `skills/` | CLI-based agent skills (`wren-generate-mdl`, `wren-usage`, `wren-dlt-connector`, `wren-onboarding`). |
| `sdk/` | Framework integrations. `sdk/wren-langchain/` (PyPI: `wren-langchain`) is shipped; CrewAI / Pydantic-AI / Goose / LlamaIndex / Mastra are coming soon. |
| `examples/` | End-to-end example projects — coming soon. |
| `docs/core/` | Module documentation. |
## Community
- Discussions: github.com/Canner/WrenAI/discussions
- Issues: github.com/Canner/WrenAI/issues
- Discord: discord.gg/canner
- Docs site: docs.getwren.ai
## License
WrenAI is multi-licensed:
- `core/**`, `sdk/**`, `skills/**`, `examples/**`, root-level files — Apache License 2.0
- `docs/**` — Creative Commons Attribution 4.0 International (CC BY 4.0)
Future modules may be introduced under GNU Affero General Public License v3.0; the full text is committed here pre-emptively. See LICENSE for the authoritative path-to-license map.
Published packages declare their effective license in their package manifest (`Cargo.toml`, `pyproject.toml`, `package.json`).
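For a Python package, that declaration looks like the following hypothetical `pyproject.toml` fragment (package name and exact form are illustrative; consult the real manifests in the repo):

```toml
[project]
# Illustrative fragment, not the repo's actual manifest.
name = "wren-engine"
license = "Apache-2.0"  # SPDX expression, per PEP 639
```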
