# 🤖 AI Data Analyst – MCP Server

An AI-powered data analysis server built on the Model Context Protocol (MCP) using FastMCP. Gives AI agents a rich, standardized toolset to connect to databases, explore schemas, execute secure SQL queries, profile data, generate interactive visualizations, and manage datasets – all through a single, consistent interface.
## 📋 Table of Contents
- Features
- Project Structure
- Prerequisites
- Installation
- Quick Start
- Adding to Claude Desktop
- MCP Tools Reference
- Supported Databases
- Security
- License
## ✨ Features
| Feature | Description |
|---|---|
| Database Connectivity | Connect to any SQLAlchemy-supported database – SQLite, PostgreSQL, MySQL, MSSQL, and more |
| Schema Exploration | Inspect tables, columns, primary keys, foreign keys, and indexes with a single tool call |
| Secure SQL Execution | Read-only query validation blocks all destructive operations before they reach the database |
| Data Profiling | Automated per-column statistics: null rates, unique counts, min/max, mean, and top values |
| Smart Chart Generation | Auto-detects the best chart type and aggregation; outputs interactive Plotly visualizations |
| Dataset Upload | Import CSV, Excel, and JSON files directly as fully queryable SQL tables |
| Session Management | Isolated, per-session database engines and caches keep concurrent analyses independent |
| Caching | 5-minute TTL in-memory cache for schemas, profiles, and analysis results |
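The caching behavior described above can be pictured as a thin TTL wrapper over a dict. Below is a minimal illustrative sketch, not the project's actual `cache_manager.py` (the class name and internals are assumptions):

```python
import time

class TTLCache:
    """Minimal in-memory cache with per-entry expiry (illustrative sketch)."""

    def __init__(self, ttl_seconds: float = 300):  # 5-minute default TTL
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry timestamp)

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expires_at = entry
        if time.monotonic() > expires_at:
            del self._store[key]  # lazily evict expired entries on read
            return default
        return value

cache = TTLCache(ttl_seconds=300)
cache.set("schema:main", {"tables": ["customers", "orders"]})
print(cache.get("schema:main"))
```

Lazy eviction on read keeps the implementation small, at the cost of holding stale entries in memory until the next access.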
## 📁 Project Structure
```
ai-analyst/
├── server.py                     # FastMCP server entry point & tool registration
├── create_test_db.py             # Generates a sample SQLite e-commerce database
│
├── database/
│   ├── connection_manager.py     # SQLAlchemy connection management (singleton)
│   └── schema_loader.py          # Schema inspection & caching
│
├── datasets/
│   └── dataset_manager.py        # CSV / Excel / JSON upload & query engine
│
├── security/
│   └── sql_validator.py          # SQL whitelist validator (SELECT / WITH only)
│
├── tools/
│   ├── analyze_schema.py         # Schema relationship analysis tool
│   ├── connect_database.py       # Database connection tool
│   ├── dataset_tools.py          # Dataset upload / list / query tools
│   ├── generate_chart.py         # Chart generation tool
│   ├── get_schema.py             # Schema fetching tool
│   ├── get_schema2.py            # Schema fetching – plain-text format (optional)
│   ├── profile_table.py          # Data profiling tool
│   ├── run_query.py              # SQL query execution tool
│   ├── sample_rows.py            # Table row sampling tool
│   └── session_tools.py          # Session creation tool
│
├── utils/
│   ├── cache_manager.py          # In-memory TTL cache
│   ├── chart_utils.py            # Plotly chart creation & auto-detection
│   ├── data_profiler.py          # SQL / pandas data profiling engine
│   ├── generate_chart.py         # Chart generation wrapper
│   ├── schema_analyzer.py        # Relationship & index analysis
│   ├── session_manager.py        # UUID session management (singleton)
│   └── sql_generator.py          # LLM-based SQL generation
│
└── logs/                         # Runtime log files
```
## 📋 Prerequisites
- Python 3.11
## 📦 Installation
1. Clone the repository

   ```bash
   git clone <repo-url>
   cd ai-analyst
   ```

2. Create and activate a virtual environment

   ```bash
   python -m venv venv

   # macOS / Linux
   source venv/bin/activate

   # Windows
   venv\Scripts\activate
   ```

3. Install dependencies

   ```bash
   pip install -r requirements.txt
   ```
## 🚀 Quick Start
(Optional) Generate the sample database:

```bash
python create_test_db.py
```

This creates `sample_store.db` – a ready-to-query SQLite e-commerce database containing `customers`, `products`, `orders`, and `order_items` tables, preloaded with sample data.
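For a sense of what that sample database contains, here is a sketch of an equivalent schema built with the standard library's `sqlite3`. The table names come from this README; the exact columns in `create_test_db.py` are assumptions:

```python
import sqlite3

# Illustrative sketch of the kind of schema create_test_db.py produces;
# column definitions here are hypothetical.
conn = sqlite3.connect("sample_store.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS customers (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    email TEXT UNIQUE
);
CREATE TABLE IF NOT EXISTS products (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    price REAL NOT NULL
);
CREATE TABLE IF NOT EXISTS orders (
    id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers(id),
    created_at TEXT
);
CREATE TABLE IF NOT EXISTS order_items (
    order_id INTEGER REFERENCES orders(id),
    product_id INTEGER REFERENCES products(id),
    quantity INTEGER NOT NULL
);
""")
conn.commit()
```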
Start the MCP server:

```bash
python server.py
```
## 🖥 Adding to Claude Desktop
- Open Claude Desktop and go to Settings → Developer
- Click Edit Config to open `claude_desktop_config.json`
- Add the following entry inside `"mcpServers"`:

```json
"AI Data Analyst MCP Server": {
  "command": "<path-to-your-python>",
  "args": [
    "<path-to-project>/server.py"
  ],
  "env": {},
  "transport": "stdio"
}
```
## 🖥 Adding to your own chatbot
```python
SERVERS = {
    "AI Data Analyst MCP Server": {
        "command": "<path-to-your-python>",
        "args": ["<path-to-project>/server.py"],
        "env": {},
        "transport": "stdio",
    }
}
```
In your chatbot code, initialize the client:

```python
from langchain_mcp_adapters.client import MultiServerMCPClient

client = MultiServerMCPClient(SERVERS)
```

Then bind the MCP tools to your chatbot's LLM (`get_tools()` is a coroutine in current versions of langchain-mcp-adapters, so call it with `await`):

```python
tools = await client.get_tools()
llm_with_tools = llm.bind_tools(tools)
```

Now your chatbot can call any of the MCP tools defined in `server.py`!
Replace the placeholder values:

| Placeholder | Example |
|---|---|
| `<path-to-your-python>` | `C:\Users\you\ai-analyst\venv\Scripts\python.exe` (Windows) or `/home/you/ai-analyst/venv/bin/python` (macOS/Linux) |
| `<path-to-project>` | `C:\Users\you\ai-analyst` or `/home/you/ai-analyst` |
- Save the file and restart Claude Desktop.
## 🔧 MCP Tools Reference
| Tool | Description |
|---|---|
| `hello` | Health check – confirms the server is running |
| `create_session` | Creates a new isolated analysis session |
| `connect_database_tool` | Connects to a database using a SQLAlchemy URI |
| `get_schema` | Retrieves the table and column schema for the connected database |
| `sample_rows_tool` | Samples up to 50 rows from a specified table |
| `run_query_tool` | Executes a validated, read-only SQL query |
| `profile_table` | Generates column-level statistics for a table |
| `analyze_schema` | Detects table relationships, keys, and indexes |
| `generate_chart_tool` | Creates an interactive Plotly chart with automatic type detection |
| `upload_dataset` | Uploads a CSV, Excel, or JSON file as a queryable SQL table |
| `list_datasets` | Lists all datasets uploaded in the current session |
| `query_dataset` | Runs SQL against previously uploaded datasets |
## 🗄 Supported Databases
Connect to any SQLAlchemy-compatible database using a standard URI:
```text
# SQLite
sqlite:///path/to/database.db

# PostgreSQL
postgresql://user:password@host:port/dbname

# MySQL
mysql://user:password@host:port/dbname

# Microsoft SQL Server
mssql+pyodbc://user:password@host/dbname?driver=ODBC+Driver+17+for+SQL+Server
```
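Whichever backend you use, the URI goes straight into SQLAlchemy's `create_engine`. A quick self-contained illustration with an in-memory SQLite URI (the server's `connect_database_tool` performs this step for you):

```python
from sqlalchemy import create_engine, text

# The same pattern works for any of the URIs listed above;
# an in-memory SQLite URI is used so this snippet runs anywhere.
engine = create_engine("sqlite:///:memory:")
with engine.connect() as conn:
    print(conn.execute(text("SELECT 1")).scalar())  # prints 1
```

Note that non-SQLite backends additionally require the matching driver package (e.g. `psycopg2` for PostgreSQL, `pyodbc` for MSSQL) to be installed.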
## 🔒 Security
The server enforces strict read-only SQL execution to protect your data.
**Allowed statements**

- `SELECT`
- `WITH` (Common Table Expressions / CTEs)

**Blocked keywords**

`DROP` · `DELETE` · `UPDATE` · `ALTER` · `INSERT` · `TRUNCATE` · `MERGE` · `GRANT` · `REVOKE`

**Additional protections**

- Multi-statement queries (`;`-separated) are rejected outright
- Validation is case-insensitive and uses regex word boundaries to prevent bypass attempts
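In spirit, the checks in `security/sql_validator.py` can be sketched as follows. This is a simplified illustration under the rules listed above; the project's actual validator may differ in detail:

```python
import re

BLOCKED = ("DROP", "DELETE", "UPDATE", "ALTER", "INSERT",
           "TRUNCATE", "MERGE", "GRANT", "REVOKE")

def is_read_only(sql: str) -> bool:
    """Return True only for single-statement SELECT/WITH queries."""
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:  # reject multi-statement queries outright
        return False
    # whitelist: the statement must start with SELECT or WITH
    if not re.match(r"^\s*(SELECT|WITH)\b", stripped, re.IGNORECASE):
        return False
    # case-insensitive keyword blocking with word boundaries, so that
    # identifiers like a column named last_update are not false positives
    for kw in BLOCKED:
        if re.search(rf"\b{kw}\b", stripped, re.IGNORECASE):
            return False
    return True

print(is_read_only("SELECT * FROM customers"))   # True
print(is_read_only("DROP TABLE customers"))      # False
print(is_read_only("SELECT 1; DELETE FROM t"))   # False
```

Combining a statement whitelist with keyword blocking is defense in depth: either check alone catches most destructive statements, and together they also reject statement types outside the whitelist, such as `PRAGMA`.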
## 📄 License
This project is provided as-is for educational and research purposes.
