Basic FastMCP Server
A basic implementation of MCP using FastMCP. The server exposes a greet(name) tool over an MCP HTTP endpoint; the client then uses OpenAI's Responses API with an MCP tool connection (via an ngrok public URL) so the model can discover the tool and call it to generate a personalized greeting.
FastMCP + OpenAI MCP Tool Demo (Greeting Server)
A minimal example showing how to expose a tool via FastMCP and call it using OpenAI’s MCP tool integration.
The MCP server runs locally, is exposed publicly via ngrok, and the OpenAI client connects to the MCP endpoint to discover tools and invoke them (tool calling).
What This Project Does
- Starts a FastMCP server that exposes a single tool: `greet(name: str) -> str`
- Uses ngrok to create a public URL for the local server
- Runs an OpenAI client script that:
  - connects to the MCP server URL
  - retrieves the tool list
  - calls the tool through OpenAI tool calling
  - prints the greeting result
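The server side of the steps above can be sketched roughly as follows. This is a minimal, hedged sketch assuming fastmcp's `FastMCP` class, its `tool` registration helper, and an HTTP transport; the server name, host, port, and transport string are illustrative, and the exact `run()` signature may differ between fastmcp versions (check the fastmcp docs).

```python
def greet(name: str) -> str:
    """Return a personalized greeting for the given name."""
    return f"Hello, {name}!"


if __name__ == "__main__":
    # Assumption: the package is importable as `fastmcp` and exposes FastMCP.
    from fastmcp import FastMCP

    mcp = FastMCP("greeting-server")
    mcp.tool(greet)  # register the plain function as an MCP tool

    # Assumption: an HTTP transport on localhost:8000; transport names
    # vary by fastmcp version (e.g. "http" vs "streamable-http").
    mcp.run(transport="http", host="127.0.0.1", port=8000)
```

Keeping `greet` as a plain function and registering it explicitly keeps the greeting logic testable without the server running.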
Requirements
- Python 3.12 recommended
- An OpenAI API key
- ngrok installed and logged in
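Once ngrok is installed and authenticated, the public tunnel for the local server might be created like this (a sketch assuming the server listens on port 8000; the port and the MCP path depend on how you start the server):

```shell
# Start a public tunnel to the local MCP server (assumed on port 8000).
# ngrok prints a Forwarding URL such as https://<subdomain>.ngrok-free.app;
# the MCP endpoint is that URL plus the server's path (e.g. /mcp).
ngrok http 8000
```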
Setup
1) Create and activate a virtual environment (Windows / PowerShell)

```powershell
py -3.12 -m venv .venv
.\.venv\Scripts\Activate.ps1
python -m pip install --upgrade pip setuptools wheel
```

2) Install the dependencies

```powershell
python -m pip install -U fastmcp openai python-dotenv
```
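With the dependencies installed, the client script might look roughly like the sketch below. It assumes the Responses API's hosted `"mcp"` tool type; the model name, server label, and ngrok URL are placeholders you would replace with your own values, and `MCP_SERVER_URL` is a hypothetical environment variable introduced here for convenience.

```python
import os

# Placeholder: replace with the public ngrok URL of your running server.
NGROK_URL = os.environ.get(
    "MCP_SERVER_URL", "https://example.ngrok-free.app/mcp"
)


def mcp_tool_config(server_url: str) -> dict:
    """Build the MCP tool entry passed to the Responses API."""
    return {
        "type": "mcp",
        "server_label": "greeting-server",
        "server_url": server_url,
        "require_approval": "never",  # skip per-call approval for the demo
    }


if __name__ == "__main__":
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    resp = client.responses.create(
        model="gpt-4.1",  # any Responses-capable model
        tools=[mcp_tool_config(NGROK_URL)],
        input="Use the greet tool to greet Ada.",
    )
    print(resp.output_text)
```

The model discovers the `greet` tool from the MCP server behind the URL, calls it, and the final text includes the personalized greeting.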
