Knowledge Graph MCP Server
A blazingly fast, local-first Knowledge Graph server that connects to Cursor IDE. Think of it as your personal AI memory that gets smarter as you code, without sending your data anywhere.
What This Does
- 10-40x Faster than other knowledge graph solutions
- 100% Local - Your code never leaves your machine (no API keys needed!)
- Smart Memory - Remembers your conversations, decisions, and code patterns
- Real-time - Syncs instantly with Cursor IDE as you work
- Powerful Search - Find anything across your entire codebase and conversations
- Secure - Built-in authentication and input validation
Quick Start (10 Minutes to Running)
Prerequisites
- Rust (we'll install this for you)
- macOS, Linux, or Windows
- Cursor IDE (recommended) or any MCP-compatible editor
Step 1: Get the Code
git clone https://github.com/Nonymaus/cursor-kg.git
cd cursor-kg
Step 2: Install Rust (if you don't have it)
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
source ~/.cargo/env
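To confirm the toolchain is installed and on your PATH before building:
# Verify the Rust toolchain
rustc --version
cargo --version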
Step 3: Build and Run
# Build the server (takes 2-3 minutes first time)
cargo build --release
# Start the server
cargo run --release
That's it! The server is now running on http://localhost:8360
Connect to Cursor IDE
Automatic Setup (Recommended)
# This creates (or overwrites) ~/.cursor/mcp.json for you
echo '{
  "mcpServers": {
    "cursor-kg": {
      "url": "http://localhost:8360/sse"
    }
  }
}' > ~/.cursor/mcp.json
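To double-check what was written, print the file back and validate it (python3 ships with a JSON validator):
# Confirm the config exists and parses as JSON
cat ~/.cursor/mcp.json
python3 -m json.tool ~/.cursor/mcp.json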
Manual Setup
- Open Cursor IDE
- Go to Settings → MCP Servers
- Add this configuration:
{ "mcpServers": { "cursor-kg": { "url": "http://localhost:8360/sse" } } }
Test the Connection
In Cursor, try asking: "What's in my knowledge graph?"
If it works, you'll see a response from the server!
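You can also probe the SSE endpoint directly from a terminal. The exact handshake output depends on the server, but if the endpoint is up, curl should hold the connection open rather than erroring out:
# Open the event stream without buffering (Ctrl+C to stop)
curl -N -H "Accept: text/event-stream" http://localhost:8360/sse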
Configuration Options
All settings are in config.toml. Here are the most important ones:
# Basic Settings
[database]
filename = "knowledge_graph.db" # Where your data is stored
[embeddings]
model_name = "nomic-embed-text-v1.5" # AI model for understanding text
batch_size = 16 # How many texts to process at once
# Security (optional)
[security]
enable_authentication = false # Set to true for API key protection
api_key = "" # Your secret key (if auth enabled)
rate_limit_requests_per_minute = 60 # Prevent spam
# Performance
[memory]
max_cache_size_mb = 128 # How much RAM to use for caching
Tip: The defaults work great for most people. Only change these if you know what you're doing!
How to Use It
Basic Commands
# Start the server
cargo run --release
# Start with debug info (if something's wrong)
RUST_LOG=debug cargo run --release
# Run on a different port
MCP_PORT=9000 cargo run --release
# Check if it's working
curl http://localhost:8360/health
# Should return: {"status":"ok"}
What You Can Do
Once connected to Cursor, you can:
Ask Questions
- "What did we discuss about the authentication system?"
- "Show me all the functions related to database queries"
- "What are the main components of this project?"
Add Information
- "Remember that we decided to use SQLite for the database"
- "Add this code pattern to the knowledge graph"
- "Store this meeting summary"
Search & Analyze
- "Find similar code patterns"
- "What are the dependencies between these files?"
- "Show me the project structure"
Advanced Usage
Enable Security (for production):
# In config.toml
[security]
enable_authentication = true
api_key = "your-secret-key-here"
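openssl is a quick way to generate a strong key. Note that the request header name below is an assumption for illustration, not a documented API; check src/security/ for the header the server actually expects:
# Generate a random 32-byte hex key for api_key in config.toml
openssl rand -hex 32
# Example authenticated request (the X-API-Key header name is hypothetical)
curl -H "X-API-Key: your-secret-key-here" http://localhost:8360/health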
Run with Docker:
docker build -t cursor-kg .
docker run -p 8360:8360 cursor-kg
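By default the database lives inside the container and disappears with it. Assuming the server writes its data under /app/data (the in-container path may differ in this image), a volume mount keeps it on the host:
# Persist the database outside the container (in-container path is an assumption)
docker run -p 8360:8360 -v "$(pwd)/data:/app/data" cursor-kg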
Monitor Performance:
# Check server stats
curl http://localhost:8360/metrics
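The metrics format isn't documented here, but for a rough live view you can simply poll the endpoint:
# Refresh server stats every 5 seconds (Ctrl+C to stop)
watch -n 5 'curl -s http://localhost:8360/metrics'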
Troubleshooting
Server Won't Start
# Check if port is already in use
lsof -i :8360
# Try a different port
MCP_PORT=8361 cargo run --release
# Check for errors
RUST_LOG=debug cargo run --release
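If lsof shows another process holding the port and you're sure it's safe to stop, you can free it in one line:
# Kill whatever is listening on 8360 (check it's nothing you need first!)
kill $(lsof -t -i :8360)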
Cursor Can't Connect
- Check the server is running: Visit http://localhost:8360/health
- Verify Cursor config: Make sure ~/.cursor/mcp.json has the right URL
- Restart Cursor: Sometimes it needs a restart to pick up new MCP servers
- Check the logs: Look for error messages in the terminal where you started the server
Performance Issues
# Check database size
ls -lh knowledge_graph.db
# Clear cache and restart
rm -rf ~/.cache/cursor-kg/
cargo run --release
# Reduce memory usage in config.toml
[memory]
max_cache_size_mb = 64 # Default is 128
Common Errors
"Failed to bind to address" โ Port 8360 is already in use. Try a different port or kill the other process.
"Database is locked"
โ Another instance might be running. Check with ps aux | grep cursor-kg
"Model not found" โ The AI model is downloading. Wait a few minutes and try again.
Architecture (For Developers)
┌─────────────────┐     ┌──────────────────┐     ┌─────────────────┐
│   Cursor IDE    │────▶│   MCP Protocol   │────▶│    cursor-kg    │
└─────────────────┘     └──────────────────┘     └────────┬────────┘
                                                          │
         ┌────────────────────────────────────────────────┘
         ▼
┌───────────────────────────────────────┐
│             Graph Engine              │
│  ┌──────────────┬──────────────────┐  │
│  │  Episodes    │  Relationships   │  │
│  │  Entities    │  Embeddings      │  │
│  └──────────────┴──────────────────┘  │
└──────────────────┬────────────────────┘
                   │
                   ▼
┌───────────────────────────────────────┐
│             Storage Layer             │
│  ┌──────────────┬──────────────────┐  │
│  │  SQLite      │  Cache           │  │
│  │  FTS5        │  In-Memory       │  │
│  └──────────────┴──────────────────┘  │
└───────────────────────────────────────┘
Tech Stack:
- Rust - Fast, safe systems programming
- SQLite + FTS5 - Local database with full-text search
- ONNX Runtime - Local AI models (no internet required)
- MCP Protocol - Standard way to connect to editors
Performance
This thing is fast. Here's why:
- Written in Rust - Compiled, not interpreted
- Local everything - No network calls to AI APIs
- Smart caching - Frequently used data stays in memory
- Efficient storage - SQLite with full-text search built-in
Real numbers:
- Memory: ~50MB baseline (grows with your data)
- Storage: ~2MB per 1000 conversations/episodes
- Speed: 10-40x faster than Python-based alternatives
Development
Want to contribute or modify the code? Here's how:
Project Structure
cursor-kg/
├── src/
│   ├── main.rs          # Server entry point
│   ├── mcp/             # MCP protocol handling
│   ├── graph/           # Knowledge graph logic
│   ├── embeddings/      # AI model integration
│   ├── search/          # Search functionality
│   └── security/        # Authentication & validation
├── config.toml          # Configuration
├── tests/               # Test files
└── README.md            # This file
Running Tests
# Run all tests
cargo test
# Run with output
cargo test -- --nocapture
# Run specific test
cargo test test_name
Making Changes
- Fork the repo on GitHub
- Make your changes in a new branch
- Test everything with cargo test
- Submit a pull request
Adding Features
- New MCP tools: Add to src/mcp/handlers.rs
- Database changes: Modify src/graph/storage.rs
- Configuration options: Update config.toml and src/config/mod.rs
Docker (Optional)
If you prefer containers:
# Build and run
docker build -t cursor-kg .
docker run -p 8360:8360 cursor-kg
# Or use docker-compose
docker-compose up -d
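If you're adapting the setup into your own stack rather than using the repo's compose file, a minimal docker-compose.yml sketch might look like this (the in-container data path is an assumption, as above):
services:
  cursor-kg:
    build: .
    ports:
      - "8360:8360"
    volumes:
      - ./data:/app/data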
More Information
- Security: See SECURITY_AUDIT_REPORT.md for security features
- Configuration: See CONFIG_MIGRATION_GUIDE.md for detailed config options
- Development: Check out the other .md files for implementation details
Contributing
Found a bug? Want to add a feature? Contributions are welcome!
- Fork the repository
- Create a feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
License
This project is licensed under the MIT License - see the LICENSE file for details.
Acknowledgments
- Built with Rust for performance and safety
- Uses SQLite for reliable local storage
- Integrates with Cursor IDE via the MCP protocol
- AI embeddings powered by ONNX Runtime
Questions? Open an issue on GitHub or check the troubleshooting section above!
