# MCP Agent Demo
A multi-step tool orchestration system using the Model Context Protocol (MCP), LangChain, and custom LLM integration (ChatMMC). The system automatically plans and executes multi-step workflows using available tools.
## Features

- **Automatic Planning**: LLM-powered multi-step execution planning
- **Tool Orchestration**: Sequential tool execution with dependency management
- **Streaming API**: Real-time Server-Sent Events (SSE) streaming
- **Chat Interface**: ChatGPT-like UI built with Streamlit
- **Extensible**: Easy to add new tools
## Prerequisites

- Python 3
- MMC API key (for internal use) or a compatible LLM API endpoint
## Installation

1. **Clone the repository**

   ```bash
   cd /Users/ankit/Desktop/varsha_projects/mcp_demo_owg
   ```

2. **Install dependencies**

   ```bash
   pip install fastmcp langchain-openai python-dotenv uvicorn fastapi streamlit requests
   ```

3. **Set up environment variables**

   Copy the example environment file and update it with your credentials:

   ```bash
   cp .env.example .env
   ```

   Edit `.env` with your API credentials:

   ```bash
   # Organization LLM Configuration
   ORG_LLM_ENDPOINT=https://your-api-endpoint/chat/completions
   ORG_LLM_API_KEY=your-api-key-here
   ORG_LLM_MODEL=your-model-name
   ORG_LLM_BASE_URL=https://your-api-base-url

   # OpenAI-compatible keys (used by ChatMMC)
   OPENAI_API_KEY=your-api-key-here
   OPENAI_MODEL=your-model-name
   ```
## Quick Start

### Option 1: Using the Streamlit UI (Recommended)

1. **Start the MCP server** (Terminal 1)

   ```bash
   python server.py
   ```

   The server runs at: `http://localhost:8080/mcp`

2. **Start the API server** (Terminal 2)

   ```bash
   python api.py
   ```

   The API runs at: `http://localhost:8000`

3. **Start the Streamlit app** (Terminal 3)

   ```bash
   streamlit run streamlit_app.py
   ```

   The UI opens at: `http://localhost:8501`

### Option 2: Using the Python Client Directly

```bash
python client.py
```

### Option 3: Using the API with curl

```bash
curl -X POST http://localhost:8000/run_agent \
  -H "Content-Type: application/json" \
  -d '{"query": "add 5 and 8 then multiply by 6"}'
```
## Project Structure

```
mcp_demo_owg/
├── server.py            # MCP server (loads and registers tools)
├── api.py               # FastAPI server with SSE streaming
├── client.py            # MCPAgentOrchestrator class
├── streamlit_app.py     # Streamlit chat interface
├── llm.py               # ChatMMC class (custom LLM integration)
├── tools/
│   ├── generic_tools.py # Conversational tools
│   └── math_tools.py    # Math operation tools
├── .env                 # Environment variables (create from .env.example)
├── .env.example         # Example environment configuration
└── README.md            # This file
```
## Custom LLM Integration (ChatMMC)

The project uses a custom `ChatMMC` class in `llm.py` that integrates with your organization's LLM API endpoint. The class is designed as a drop-in replacement for LangChain's `ChatOpenAI`.

**Features:**

- Compatible with LangChain chains and pipelines
- Supports environment variable configuration
- Handles both dict and LangChain message formats
- Automatically loads credentials from the `.env` file
**Usage Example:**

```python
from llm import ChatMMC

# Initialize with defaults from .env
llm = ChatMMC()

# Or explicitly provide configuration
llm = ChatMMC(
    api_key="your-api-key",
    model="your-model-name",
    temperature=0.7
)

# Use it like ChatOpenAI
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"}
]
response = llm.invoke(messages)
print(response)
```
**Environment Variables:**

The `ChatMMC` class reads these environment variables, in order of preference:

- `OPENAI_API_KEY` or `ORG_LLM_API_KEY` – your API key
- `OPENAI_MODEL` or `ORG_LLM_MODEL` – model name
- `ORG_LLM_BASE_URL` – base URL for the API endpoint
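As a rough illustration of that fallback order (a minimal sketch; the helper name `resolve_llm_config` is illustrative and the real `ChatMMC` internals may differ):

```python
import os

def resolve_llm_config() -> dict:
    """Resolve LLM settings, preferring the OPENAI_* variables and
    falling back to their ORG_LLM_* equivalents."""
    return {
        "api_key": os.getenv("OPENAI_API_KEY") or os.getenv("ORG_LLM_API_KEY"),
        "model": os.getenv("OPENAI_MODEL") or os.getenv("ORG_LLM_MODEL"),
        "base_url": os.getenv("ORG_LLM_BASE_URL"),
    }
```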
## Adding New Tools

### Step 1: Create a Tools File

Create a new file in the `tools/` directory (e.g., `tools/my_custom_tools.py`):

```python
"""
My Custom Tools for MCP
-----------------------
"""
from fastmcp import FastMCP

def register_tools(mcp: FastMCP):
    """Register custom tools with the MCP server"""

    @mcp.tool()
    async def greet_user(name: str) -> str:
        """Greet a user by name"""
        return f"Hello, {name}! Welcome to MCP Agent."

    @mcp.tool()
    async def calculate_square(number: float) -> float:
        """Calculate the square of a number"""
        return number ** 2

    @mcp.tool()
    async def reverse_text(text: str) -> str:
        """Reverse a given text"""
        return text[::-1]

    print("Custom tools registered")
```
### Step 2: Register the Tools Module

Update `server.py` to load your new tools:

```python
def create_mcp_server() -> FastMCP:
    """Create a single MCP server and load all tool modules."""
    mcp = FastMCP("IntegratedTools")

    # Add your new module to this list
    tool_modules = [
        "generic_tools",
        "my_custom_tools"  # Add this line
    ]

    for module_name in tool_modules:
        # ... rest of the code
```
### Step 3: Restart the Server

Restart the MCP server to load the new tools:

```bash
python server.py
```
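The module-loading loop elided in Step 2 can be implemented with `importlib`; a hedged sketch (the helper name `load_tool_modules` is an assumption, the actual `server.py` may differ):

```python
import importlib

def load_tool_modules(mcp, module_names):
    """Import each module from the tools/ package and call its
    register_tools hook to attach its tools to the MCP server."""
    for name in module_names:
        module = importlib.import_module(f"tools.{name}")
        module.register_tools(mcp)
```

This is why each tools file must expose a `register_tools(mcp)` function: the server discovers tools only through that hook.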
### Tool Function Requirements

- Must be `async` functions
- Use type hints for parameters
- Include docstrings (used by the planner)
- Decorate with `@mcp.tool()`
- Return serializable data (str, int, float, dict, list)
### Example: Tool with API Integration

```python
@mcp.tool()
async def fetch_weather(city: str, api_key: str) -> dict:
    """Fetch current weather for a city"""
    import httpx
    async with httpx.AsyncClient() as client:
        response = await client.get(
            "https://api.weather.com/v1/current",
            params={"city": city, "key": api_key}
        )
        return response.json()
```
## Testing

### Test with curl (SSE Stream)

```bash
curl -N -X POST http://localhost:8000/run_agent \
  -H "Content-Type: application/json" \
  -d '{
    "query": "first add 5 and 8 then multiply by 6"
  }'
```
**Expected Output:**

```
event: plan
data: {"plan": [{"tool": "add", "args": [5, 8]}, {"tool": "multiply", "args": ["PREVIOUS_RESULT", 6]}]}

event: step
data: {"step": 1, "tool": "add", "args": {"a": 5, "b": 8}}

event: step_result
data: {"step": 1, "result": 13}

event: step
data: {"step": 2, "tool": "multiply", "args": {"a": 13, "b": 6}}

event: step_result
data: {"step": 2, "result": 78}

event: final
data: {"result": 78}

event: done
data: {}
```
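Note the `PREVIOUS_RESULT` placeholder in the plan: the orchestrator substitutes each step's output into the next step's arguments before executing it. A hypothetical sketch of that substitution (the function name and exact mechanics are assumptions, not the project's actual code):

```python
def resolve_args(args, previous_result):
    """Replace the PREVIOUS_RESULT placeholder in a planned step's
    argument list with the output of the previous step."""
    return [previous_result if a == "PREVIOUS_RESULT" else a for a in args]
```

In the trace above, step 2's planned args `["PREVIOUS_RESULT", 6]` become `[13, 6]` once step 1 returns 13.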
### Test with Postman

1. Create a new POST request
   - URL: `http://localhost:8000/run_agent`
   - Headers: `Content-Type: application/json`
2. Request body:
   ```json
   { "query": "calculate (10 + 5) * 3" }
   ```
3. View the streaming response in the Postman console
### Example Queries

```bash
# Math operations
curl -X POST http://localhost:8000/run_agent \
  -H "Content-Type: application/json" \
  -d '{"query": "what is 100 divided by 4?"}'

# Multi-step calculation
curl -X POST http://localhost:8000/run_agent \
  -H "Content-Type: application/json" \
  -d '{"query": "subtract 10 from 50, then multiply the result by 2"}'

# Using conversational tools
curl -X POST http://localhost:8000/run_agent \
  -H "Content-Type: application/json" \
  -d '{"query": "say hello to me"}'
```
## API Reference

### POST /run_agent

Execute a multi-step agent workflow.

**Request:**

```json
{
  "query": "your natural language query here"
}
```

**Response:** Server-Sent Events (SSE) stream

**Event Types:**

- `plan` – execution plan generated
- `step` – tool execution started
- `step_result` – tool execution completed
- `final` – final result
- `error` – error occurred
- `done` – stream complete
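For programmatic consumers, the stream can be parsed with only the standard library; a sketch assuming the `event:` / `data:` line framing shown in the Testing section (the helper names are illustrative):

```python
import json
import urllib.request

def parse_sse_lines(lines):
    """Yield (event, payload) pairs from `event:` / `data:` line pairs."""
    event = None
    for raw in lines:
        if raw.startswith("event:"):
            event = raw.split(":", 1)[1].strip()
        elif raw.startswith("data:") and event is not None:
            yield event, json.loads(raw.split(":", 1)[1].strip())
            event = None

def run_agent(query, base_url="http://localhost:8000"):
    """POST a query to /run_agent and yield parsed SSE events."""
    req = urllib.request.Request(
        f"{base_url}/run_agent",
        data=json.dumps({"query": query}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        yield from parse_sse_lines(line.decode().strip() for line in resp)
```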
## Architecture

```
┌─────────────────┐
│  Streamlit UI   │
│   (Port 8501)   │
└────────┬────────┘
         │ HTTP POST
         ▼
┌─────────────────┐
│     FastAPI     │
│   (Port 8000)   │ ◄── SSE Stream
└────────┬────────┘
         │
         ▼
┌─────────────────┐
│    MCPAgent     │
│  Orchestrator   │
└────────┬────────┘
         │ HTTP
         ▼
┌─────────────────┐
│   MCP Server    │
│   (Port 8080)   │
└────────┬────────┘
         │
         ▼
┌─────────────────┐
│  Tool Modules   │
│ (generic, etc.) │
└─────────────────┘
```
## Security

- API keys are automatically masked in the UI (displayed as `sk-...`)
- Sensitive parameters are filtered from streaming responses
- Credentials are managed via environment variables
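The key-masking behavior can be sketched in a few lines (the helper name `mask_key` is illustrative, not the project's actual function):

```python
def mask_key(key: str, visible: int = 3) -> str:
    """Show only the leading characters of an API key (e.g. 'sk-...')."""
    if not key:
        return ""
    return key[:visible] + "..."
```

For example, `mask_key("sk-1234567890")` returns `sk-...`.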
## Troubleshooting

### Issue: "Invalid planner output"

**Solution:** The LLM is not generating valid JSON. Check the API key and model name in your `.env` file, and ensure `OPENAI_API_KEY` and `OPENAI_MODEL` are set correctly.

### Issue: "Connection refused" on port 8080

**Solution:** Make sure the MCP server is running (`python server.py`).

### Issue: LLM API connection errors

**Solution:**

- Verify your API endpoint is accessible: check `ORG_LLM_BASE_URL` in `.env`
- Confirm your API key is valid: check `OPENAI_API_KEY` in `.env`
- Test the endpoint manually with curl to verify connectivity

### Issue: "Module not found" error

**Solution:** Install the missing dependencies:

```bash
pip install fastmcp langchain-openai python-dotenv uvicorn fastapi streamlit
```
## Available Tools (Default)

### Math Tools

- `add(a, b)` – add two numbers
- `subtract(a, b)` – subtract b from a
- `multiply(a, b)` – multiply two numbers
- `divide(a, b)` – divide a by b

### Conversational Tools

- `handle_greeting(text, openai_api_key)` – respond to greetings
## Contributing

To add a new tool category:

1. Create a new file in the `tools/` directory
2. Implement a `register_tools(mcp)` function
3. Add the module name to the `tool_modules` list in `server.py`
4. Restart the MCP server
## License

MIT License
