JC MCP Tutorial
MCP Agents Educational Project
This is an educational, minimal project that demonstrates the Model Context Protocol (MCP) with three different types of servers:
- Code Server - File operations and code execution
- Database Server - SQLite database operations
- Document Server - Document management and search
What is MCP?
The Model Context Protocol (MCP) is a standardized protocol that allows AI agents to interact with various services and tools. It provides a common interface for:
- Tools: Functions that agents can call to perform actions
- Resources: Data that agents can access
- Prompts: Reusable prompt templates
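The three primitives can be pictured as a plain-Python registry. This is a hypothetical illustration of the concepts only, not the actual MCP SDK API:

```python
# Hypothetical sketch of the three MCP primitives as a plain-Python
# registry -- the real MCP SDK wraps these in a JSON-RPC protocol.

def read_file(path: str) -> str:
    """A tool: a function the agent can call to perform an action."""
    with open(path) as f:
        return f.read()

# Tools: named functions the agent can invoke with arguments
tools = {"read_file": read_file}

# Resources: data the agent can access, addressed by URI
resources = {"doc://readme": "MCP demo project"}

# Prompts: reusable templates the agent can fill in
prompts = {"summarize": "Summarize the following text:\n{text}"}

# An agent-side interaction: fetch a resource, fill a prompt with it
prompt = prompts["summarize"].format(text=resources["doc://readme"])
```

The real protocol adds discovery (listing tools/resources) and a wire format on top of this idea.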
Project Structure
.
├── servers/
│   ├── code_server.py          # Code server implementation
│   ├── database_server.py      # Database server implementation
│   └── document_server.py      # Document server implementation
├── notebooks/
│   ├── mcp_client_demo.ipynb   # Jupyter notebook with basic MCP demos
│   └── mcp_llm_agent.ipynb     # Jupyter notebook with LLM integration
├── requirements.txt            # Python dependencies
└── README.md                   # This file
Setup
1. Install dependencies:
   pip install -r requirements.txt
2. Make server scripts executable (optional):
   chmod +x servers/*.py
Running the Demo
1. Start Jupyter Notebook:
   jupyter notebook
2. Open the demo notebooks (in order):
   - START HERE: notebooks/mcp_client_demo.ipynb - Basic MCP demo. Run cells one at a time and wait for each cell to complete before running the next.
   - THEN TRY: notebooks/mcp_llm_agent.ipynb - Advanced LLM integration. Requires understanding of the basic demo first.
3. Important notes:
   - Run cells sequentially: don't run all cells at once
   - Wait for completion: look for cell numbers [1], [2], [3]; [*] means a cell is still running, so wait for it to finish
   - await works in Jupyter: no need for asyncio.run()
   - Servers start automatically: no need to manually start them
4. Troubleshooting:
   - If execution numbers don't show: see NOTEBOOK_TROUBLESHOOTING.md
   - If cells hang: press Ctrl+C to interrupt, then restart the kernel
   - If you get connection errors: see GETTING_STARTED.md for detailed help
LLM Integration
The project includes LLM integration to create intelligent agents that can understand natural language and use MCP tools. Two methods are supported:
Method 1: Ollama (Local Models)
Run LLMs locally without API keys:
- Install Ollama: download from https://ollama.ai
- Pull a model: ollama pull llama3 (or any other model)
- Start the Ollama service: ollama serve
- Use in notebook: the mcp_llm_agent.ipynb notebook will automatically detect and use Ollama
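Auto-detection can be as simple as probing Ollama's default local endpoint. The sketch below is a guess at what such a check looks like (standard library only, assuming Ollama's default port 11434); it returns False when the service isn't running:

```python
import urllib.request
import urllib.error

def ollama_available(base_url: str = "http://localhost:11434") -> bool:
    """Return True if a local Ollama service answers on its default port."""
    try:
        with urllib.request.urlopen(base_url, timeout=2) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused or timed out: Ollama is not running locally
        return False

available = ollama_available()
```

A notebook can then fall back to an API-based provider when this returns False.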
Method 2: API-based Models (OpenAI & Gemini)
Use cloud-based LLM APIs:
1. Get API keys:
   - OpenAI: get one from https://platform.openai.com/api-keys
   - Gemini: get one from https://aistudio.google.com/app/apikey
2. Set API keys (use a .env file):
   - Create a .env file in the project root: cp .env.example .env
   - Edit the .env file and add your keys:
     OPENAI_API_KEY=your-openai-key-here
     GEMINI_API_KEY=your-gemini-key-here
   - Note: you only need to set the keys for the providers you want to use
   - The .env file is gitignored for security
3. Use in notebook: open mcp_llm_agent.ipynb; keys will be loaded automatically from .env
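Loading keys from .env is usually done with the python-dotenv package; a minimal stdlib equivalent (illustrative only, ignoring quoting and multi-line values) looks like this:

```python
import os

def load_dotenv_minimal(path: str = ".env") -> dict:
    """Parse KEY=value lines from a .env file into os.environ.
    Simplified: skips comments and blank lines, no quote handling."""
    loaded = {}
    try:
        with open(path) as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith("#") or "=" not in line:
                    continue
                key, _, value = line.partition("=")
                loaded[key.strip()] = value.strip()
                # Don't clobber keys already set in the real environment
                os.environ.setdefault(key.strip(), value.strip())
    except FileNotFoundError:
        pass  # no .env file: fall back to existing environment variables
    return loaded
```

In practice, prefer `from dotenv import load_dotenv` from python-dotenv, which handles the edge cases this sketch skips.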
Example LLM Agent Usage
# Using Ollama (local)
result = await intelligent_agent(
"List all users in the database",
provider="ollama",
model="llama3"
)
# Using OpenAI
result = await intelligent_agent(
"Create a document with database statistics",
provider="openai",
model="gpt-4o-mini"
)
# Using Gemini
result = await intelligent_agent(
"Write a Python function to calculate factorial",
provider="gemini"
)
Server Details
Code Server (code_server.py)
Provides tools for:
- read_file: Read file contents
- write_file: Write content to files
- execute_code: Execute Python code
- save_code_snippet: Save code snippets for later use
- list_code_snippets: List all saved code snippets
Resources:
- Code snippets stored in memory (accessible via code:// URIs)
Database Server (database_server.py)
Provides tools for:
- execute_query: Execute SQL queries (SELECT, INSERT, UPDATE, DELETE)
- list_tables: List all database tables
- describe_table: Get schema information for a table
- insert_user: Insert a new user (convenience method)
- get_user: Get a user by email (convenience method)
Resources:
- Database schema (db://schema)
- List of tables (db://tables)
Note: The server automatically creates an SQLite database (example.db) with sample data on first run.
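The sample-data setup can be sketched with the standard sqlite3 module. This runs against an in-memory database for illustration; the actual server writes example.db, and its exact schema may differ:

```python
import sqlite3

# In-memory stand-in for example.db; the real server persists to disk.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE IF NOT EXISTS users ("
    "id INTEGER PRIMARY KEY, name TEXT, email TEXT UNIQUE)"
)
conn.execute(
    "INSERT INTO users (name, email) VALUES (?, ?)",
    ("Alice", "alice@example.com"),
)
conn.commit()

# The get_user convenience tool boils down to a lookup like this
row = conn.execute(
    "SELECT name FROM users WHERE email = ?", ("alice@example.com",)
).fetchone()
```

Note the parameterized `?` placeholders: the server's convenience tools should use these rather than string formatting to avoid SQL injection.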
Document Server (document_server.py)
Provides tools for:
- create_document: Create a new text document
- read_document: Read document contents
- list_documents: List all available documents
- search_documents: Search for text across all documents
- append_to_document: Append text to an existing document
- delete_document: Delete a document
Resources:
- Documents stored in the documents/ directory (accessible via doc:// URIs)
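A search_documents-style tool can be sketched as a case-insensitive substring scan over the document directory (illustrative only; the real server's matching and file layout may differ):

```python
from pathlib import Path
import tempfile

def search_documents(directory: str, query: str) -> list[str]:
    """Return names of .txt documents whose content contains query."""
    q = query.lower()
    return sorted(
        p.name
        for p in Path(directory).glob("*.txt")
        if q in p.read_text().lower()
    )

# Demo against a throwaway directory standing in for documents/
with tempfile.TemporaryDirectory() as d:
    (Path(d) / "notes.txt").write_text("MCP uses JSON-RPC over stdio")
    (Path(d) / "todo.txt").write_text("buy milk")
    hits = search_documents(d, "json-rpc")
```

For large document sets a real implementation would want an index rather than a full scan per query.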
How MCP Works
Transport Mechanisms
MCP supports multiple transport mechanisms:
1. stdio (Standard Input/Output) - used in this project
   - ✓ Simple: no network configuration needed
   - ✓ Secure: no exposed ports
   - ✓ Process management: the client spawns the server
   - ✗ Local only: cannot connect to remote servers
   - ✗ One client per server: each client spawns its own process
2. HTTP/SSE (Server-Sent Events) - alternative option
   - ✓ Remote access: connect to servers on different machines
   - ✓ Scalable: multiple clients can share one server
   - ✓ Web integration: can be accessed from browsers
   - ✗ More complex: requires a web server framework
   - ✗ Security: needs authentication/authorization
Why stdio in this project?
- Educational focus: Simpler to understand and set up
- Local development: Perfect for learning and testing
- MCP standard: Primary transport mechanism in MCP spec
- No dependencies: Works out-of-the-box
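The client-spawns-server relationship can be demonstrated with a toy stdio exchange. This is plain subprocess plus JSON lines, not the MCP SDK, which handles message framing and the full JSON-RPC protocol for you:

```python
import json
import subprocess
import sys

# Toy "server": reads one JSON request from stdin, writes a JSON reply.
SERVER = r"""
import json, sys
req = json.loads(sys.stdin.readline())
print(json.dumps({"echo": req["msg"]}))
"""

# The "client" spawns the server process and talks over stdin/stdout --
# the same process relationship MCP's stdio transport uses.
proc = subprocess.run(
    [sys.executable, "-c", SERVER],
    input=json.dumps({"msg": "hello"}) + "\n",
    capture_output=True,
    text=True,
)
reply = json.loads(proc.stdout)
```

Because the client owns the child process, there are no ports to secure and no network setup, which is exactly the trade-off listed above.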
When to use HTTP instead:
- Production deployments
- Remote server access
- Multiple clients sharing one server
- Web browser integration
See TRANSPORT_COMPARISON.md for detailed comparison.
Communication Flow
1. Server: an MCP server exposes tools and resources via stdio (or HTTP)
2. Client: an MCP client connects to the server and can:
   - List available tools
   - Call tools with arguments
   - List available resources
   - Access resources by URI
3. Communication: uses JSON-RPC over stdio (or HTTP/SSE)
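A tool-call request on the wire looks roughly like this. Field names follow JSON-RPC 2.0; the exact method and params shape come from the MCP spec and may vary by version:

```python
import json

# JSON-RPC 2.0 request an MCP client sends to invoke a tool.
request = {
    "jsonrpc": "2.0",
    "id": 1,                      # correlates the reply with this request
    "method": "tools/call",
    "params": {
        "name": "read_file",      # which tool to run
        "arguments": {"path": "README.md"},
    },
}

# Serialized to a single line and written to the server's stdin.
wire = json.dumps(request)
```

The server replies with a JSON-RPC response carrying the same `id`, so the client can match replies to in-flight requests.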
Example Usage
Basic MCP Demo (mcp_client_demo.ipynb)
The basic notebook demonstrates:
- Connecting to each server
- Listing available tools
- Calling tools with different parameters
- Accessing resources
- Combining multiple servers in a workflow
LLM Agent Demo (mcp_llm_agent.ipynb)
The LLM agent notebook demonstrates:
- Using Ollama for local LLM inference
- Using OpenAI API for cloud-based LLMs
- Using Google Gemini API
- Natural language interaction with MCP servers
- Intelligent tool selection and execution
- Multi-step workflow automation
Educational Value
This project illustrates:
- How to create MCP servers
- How to implement tools and resources
- How to connect clients to servers
- How to use multiple servers together
- How to integrate LLMs with MCP servers
- How to create intelligent agents that understand natural language
- Real-world patterns for agentic applications
Transport Options
Current Implementation: stdio
The servers in this project use stdio transport, which is:
- Simple to set up
- Secure (no network ports)
- Perfect for local development
- The standard MCP transport
Adding HTTP Support
To use HTTP transport instead:
1. Install HTTP dependencies (optional):
   pip install fastapi uvicorn
2. Run the server as an HTTP service:
   python servers/code_server_http.py
3. Connect the client via HTTP:
   from mcp.client.sse import sse_client  # use instead of stdio_client
See TRANSPORT_COMPARISON.md for detailed comparison of stdio vs HTTP.
Extending the Project
You can extend this project by:
- Adding more tools to existing servers
- Creating new servers (e.g., API server, file system server)
- Implementing HTTP transport for remote access
- Adding authentication and authorization
- Implementing more complex workflows
- Adding error handling and validation
- Creating a web interface for the servers
License
This is an educational project. Feel free to use and modify as needed.
