FastMCP-x: Enterprise Document-Aware Query Assistant
A full-stack MCP application with AI-powered semantic search, pgvector database indexing, and a modern web interface. Production-ready for knowledge management at scale.
Key Features
- Multi-Format Ingestion: PDF, DOCX, PPTX, XLS/XLSX, CSV, TXT
- pgvector Semantic Search: Enterprise-scale database-side similarity search (<10ms queries)
- AI Responses: Context-aware LLM answers using Ollama
- Structured Data Queries: Natural language for Excel/CSV files
- Web Search: Tavily API integration
- Enterprise Auth: Supabase magic links
- Modern UI: Next.js with real-time chat interface
Architecture
Next.js Frontend (3000)
  ↓ HTTP
Bridge Server (3001)
  ↓ MCP Protocol
FastMCP Server (8000) + PostgreSQL/pgvector
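To make the hop explicit: the browser never talks to the MCP server directly; it posts to the bridge, which relays the call over MCP. A minimal sketch, assuming a hypothetical /api/query route on the bridge (the real routes are listed at http://localhost:3001/docs):

```python
# Hypothetical request path: browser -> bridge -> MCP server.
# The /api/query route is an assumption; check http://localhost:3001/docs.
import requests

resp = requests.post(
    "http://localhost:3001/api/query",  # bridge server (hypothetical route)
    json={"query": "What does the Q3 report say about revenue?"},
    timeout=60,
)
print(resp.json())  # the bridge relays the tool call to FastMCP over MCP (SSE)
```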
Quick Overview
- Frontend: Next.js 14, TypeScript, Tailwind CSS, Supabase Auth
- Backend: FastMCP, FastAPI, sentence-transformers, Ollama
- Database: Supabase PostgreSQL with pgvector extension
- Search: 384-dimensional embeddings with IVFFLAT indexing
Prerequisites
- Python 3.9+
- Node.js 18+
- Ollama (local LLM inference)
- Supabase account (auth + database)
- Docker & Docker Compose (for containerized deployment)
Quick Start
Option 1: Docker (Recommended)
Easiest way to run the entire stack!
# 1. Configure environment (root .env)
# Required:
# SUPABASE_URL=...
# SUPABASE_ANON_KEY=...
# SUPABASE_SERVICE_ROLE_KEY=...
# Optional:
# TAVILY_API_KEY=...
# OLLAMA_MODEL=llama3.2:3b
# 2. Start all services
docker compose up --build -d
# 3. Access applications
# Frontend: http://localhost:3000
# Bridge Docs: http://localhost:3001/docs
# Backend SSE: http://localhost:8000/sse
# Ollama: http://localhost:11434
Development with hot reload:
docker compose -f docker-compose.dev.yml up --build
See BRIDGE_SERVER.md for full documentation.
Option 2: Local Development
1. Backend Setup
pip install -r requirements.txt
ollama pull llama3.2:3b
2. Configure Environment
Create .env in project root:
NEXT_PUBLIC_SUPABASE_URL=https://your-project.supabase.co
SUPABASE_SERVICE_ROLE_KEY=your-service-role-key
OLLAMA_HOST=http://localhost:11434
Frontend: Create frontend/.env.local:
NEXT_PUBLIC_SUPABASE_URL=https://your-project.supabase.co
NEXT_PUBLIC_SUPABASE_ANON_KEY=your-anon-key
NEXT_PUBLIC_BRIDGE_SERVER_URL=http://localhost:3001
3. Start All Services
Manual (4 terminals):
# Terminal 1: Ollama
ollama serve
# Terminal 2: FastMCP Server
python server/main.py
# Terminal 3: Bridge Server
python bridge_server.py
# Terminal 4: Frontend
cd frontend && npm run dev
Visit http://localhost:3000
Project Structure
FastMCP-x/
├── server/
│   ├── main.py                 # MCP tools registration
│   ├── query_handler.py        # pgvector semantic search
│   ├── document_ingestion.py   # File processing + embeddings
│   ├── csv_excel_processor.py  # Structured data queries
│   ├── web_search_file.py      # Web search integration
│   ├── agent.py                # Agent orchestration
│   └── instructions.py         # Instruction handling
├── frontend/
│   ├── app/components/         # Chat, Sidebar, Auth UI
│   ├── lib/supabase/           # Database service layer
│   └── middleware.ts           # Auth middleware
├── bridge_server.py            # FastAPI bridge (MCP client)
├── utils/file_parser.py        # Document extraction
├── client/fast_mcp_client.py   # CLI test client
└── documentations/             # Guides + architecture
MCP Tools Available
| Tool | Purpose |
|---|---|
| ingest_file_tool | Upload and process documents |
| answer_query_tool | Query with semantic search |
| query_excel_with_llm_tool | Natural language on Excel |
| query_csv_with_llm_tool | Natural language on CSV |
| web_search_tool | Web search + LLM summary |
| answer_link_query_tool | Extract and analyze URLs |
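A quick way to exercise these tools is the FastMCP Python client. A minimal sketch, assuming a FastMCP 2.x Client and a `query` argument name (check the actual tool signatures in server/main.py, or use client/fast_mcp_client.py):

```python
# Minimal sketch: calling a tool over SSE with the FastMCP client.
# The "query" argument name is an assumption; see server/main.py.
import asyncio
from fastmcp import Client

async def main():
    async with Client("http://localhost:8000/sse") as client:
        result = await client.call_tool(
            "answer_query_tool",
            {"query": "Summarize the onboarding handbook"},
        )
        print(result)

asyncio.run(main())
```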
Database Schema
Core Tables
- files: Document metadata (id, filename, file_type, size_bytes, etc.)
- document_content: Extracted text from files
- document_embeddings: 384-dim vectors (pgvector type) with IVFFLAT index
- workspaces: User workspace organization
- chats: Conversation history
- users (via Supabase Auth): User authentication
Key Feature: pgvector
-- Enterprise semantic search at database level
SELECT * FROM document_embeddings
ORDER BY embedding <=> query_embedding -- <=> = cosine distance operator
LIMIT 5;
-- Indexed for sub-10ms queries
CREATE INDEX ON document_embeddings USING ivfflat (embedding vector_cosine_ops);
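The same search can be run ad hoc from Python over a direct Postgres connection; a sketch assuming the `embedding` column above and the pgvector Python package (the app itself goes through a Supabase RPC instead):

```python
# Sketch: the similarity search above, over a direct Postgres connection.
# Requires: pip install "psycopg[binary]" pgvector sentence-transformers
import psycopg
from pgvector.psycopg import register_vector
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")
query_vec = model.encode("quarterly revenue targets")  # 384-dim vector

with psycopg.connect("postgresql://user:pass@host:5432/postgres") as conn:
    register_vector(conn)  # registers the pgvector type with psycopg
    rows = conn.execute(
        "SELECT * FROM document_embeddings ORDER BY embedding <=> %s LIMIT 5",
        (query_vec,),
    ).fetchall()
```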
How It Works
Document Upload
1. File uploaded
2. Text extracted via file_parser
3. Split into 600-char chunks with 50-char overlap
4. Each chunk embedded (384 dims) using sentence-transformers
5. Embeddings stored in pgvector table with IVFFLAT index
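A sketch of steps 3–4; `chunk_text` below is an illustrative helper, not the project's actual function (the real logic lives in server/document_ingestion.py):

```python
# Illustrative chunker: 600-char windows with 50-char overlap.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # same model the server uses

def chunk_text(text: str, size: int = 600, overlap: int = 50) -> list[str]:
    step = size - overlap  # 550-char stride keeps 50 chars of shared context
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

chunks = chunk_text(open("report.txt").read())
embeddings = model.encode(chunks)  # shape: (n_chunks, 384)
```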
Query Processing
1. User query → embed to 384-dim vector
2. pgvector RPC: find similar embeddings via <=> operator
3. Top-K chunks returned (<10ms)
4. LLM answers using chunks as context
5. Response with source attribution
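A sketch of steps 1–3, assuming a hypothetical `match_document_embeddings` RPC name (the real name lives in the project's SQL and server/query_handler.py):

```python
# Sketch of query-time retrieval via a Supabase RPC.
import os
from sentence_transformers import SentenceTransformer
from supabase import create_client

model = SentenceTransformer("all-MiniLM-L6-v2")
supabase = create_client(
    os.environ["NEXT_PUBLIC_SUPABASE_URL"],
    os.environ["SUPABASE_SERVICE_ROLE_KEY"],
)

query_vec = model.encode("What were the action items from Monday?").tolist()
matches = supabase.rpc(
    "match_document_embeddings",  # hypothetical RPC name
    {"query_embedding": query_vec, "match_count": 5},
).execute()
# matches.data holds the top-K chunks, which are passed to Ollama as context
```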
Performance
| Metric | Value |
|---|---|
| Query Latency | <10ms |
| Memory Overhead | ~0 MB (search runs in the database, not app memory) |
| Max Documents | Unlimited (bounded only by database storage) |
| Startup Time | <1s |
| Model | all-MiniLM-L6-v2 (384 dims) |
Development
Frontend
cd frontend
npm install # Install dependencies
npm run dev # Dev server (hot reload)
npm run build # Production build
npm run type-check # TypeScript validation
npm run lint # ESLint checks
Backend
python server/main.py # Start FastMCP server
python bridge_server.py # Start bridge server
python client/fast_mcp_client.py # Test CLI client
Adding Features
New File Format:
- Update utils/file_parser.py with extraction logic
- Add dependencies to requirements.txt
New MCP Tool:
- Create function in appropriate module
- Register with @mcp.tool in server/main.py
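For example, a minimal sketch of a new tool (in the real project, reuse the existing `mcp` instance in server/main.py instead of creating a new one):

```python
# Minimal sketch of a new MCP tool registration.
from fastmcp import FastMCP

mcp = FastMCP("fastmcp-x")  # server name is an assumption

@mcp.tool()
def word_count_tool(text: str) -> int:
    """Count the words in a block of text."""
    return len(text.split())
```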
New Frontend Component:
- Create in frontend/app/components/
- Follow TypeScript + Tailwind patterns
- Test accessibility
Authentication
- Provider: Supabase Auth
- Method: Magic links (email)
- Setup: Add http://localhost:3000/auth/callback to Supabase redirect URLs
Magic Link Login Flow:
1. User enters email
2. Supabase sends magic link
3. User clicks link → redirects to /auth/callback
4. Middleware validates session
5. Redirects to /dashboard
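For illustration, step 2 triggered from Python with supabase-py (the real flow runs in the Next.js frontend via supabase-js):

```python
# Sending a magic link via supabase-py, for illustration only.
import os
from supabase import create_client

supabase = create_client(
    os.environ["SUPABASE_URL"],       # names as in the root .env
    os.environ["SUPABASE_ANON_KEY"],
)

supabase.auth.sign_in_with_otp({
    "email": "user@example.com",
    "options": {"email_redirect_to": "http://localhost:3000/auth/callback"},
})
```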
Documentation
| Document | Purpose |
|---|---|
| SETUP.md | Detailed setup for developers |
| ARCHITECTURE.md | Full system architecture overview |
| BRIDGE_SERVER.md | Bridge server architecture |
| WORKSPACE_SCHEMA_GUIDE.md | Database schema details |
| .github/copilot-instructions.md | AI coding guidelines |
Configuration
Environment Variables
Backend (.env):
NEXT_PUBLIC_SUPABASE_URL # Supabase project URL
SUPABASE_SERVICE_ROLE_KEY # For backend operations
OLLAMA_HOST # Ollama endpoint (default: http://localhost:11434)
TAVILY_API_KEY # For web search
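A sketch of how backend code can pick these values up, assuming python-dotenv is installed:

```python
# Loading the backend .env, python-dotenv assumed.
import os
from dotenv import load_dotenv

load_dotenv()  # reads .env from the project root / working directory
ollama_host = os.getenv("OLLAMA_HOST", "http://localhost:11434")  # doc default
```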
Frontend (frontend/.env.local):
NEXT_PUBLIC_SUPABASE_URL # Supabase project URL
NEXT_PUBLIC_SUPABASE_ANON_KEY # Public anon key
NEXT_PUBLIC_BRIDGE_SERVER_URL # Bridge server URL
Troubleshooting
| Issue | Solution |
|---|---|
| Ollama not found | Install from https://ollama.ai and run ollama serve |
| Port already in use | Change with npm run dev -- -p 3001 |
| Auth redirect fails | Add redirect URL to Supabase |
| No embeddings | Check pgvector enabled: SELECT * FROM pg_extension WHERE extname='vector' |
| Slow queries | Verify IVFFLAT index: SELECT * FROM pg_indexes WHERE tablename='document_embeddings' |
| TypeScript errors | Run npm install in frontend directory |
Frontend Components
Sidebar
- Collapsible (256px ↔ 64px)
- Persistent state (localStorage)
- Smooth animations (300ms)
- Keyboard navigation + accessibility
Chat Interface
- Message history display
- Real-time message streaming (UI ready)
- File attachment support
- Keyboard shortcuts (Cmd/Ctrl+Enter)
Authentication
- Magic link login
- Session management
- Protected routes
- User profile display
Deployment
Production Checklist
- pgvector enabled in Supabase
- Environment variables configured
- Ollama running on deployment server
- CORS configured for frontend domain
- IVFFLAT index created on embeddings table
- Supabase backup configured
- Rate limiting enabled
- Monitoring/logging set up
Contributing
- Create feature branch: git checkout -b feature/name
- Follow PEP8 (Python) and ESLint (TypeScript)
- Add tests for new features
- Update documentation
- Submit PR with description
License
MIT
