ToolBridge MCP Server
MCP (Model Context Protocol) server exposing tool endpoints via stdio and HTTP. Tested with the official MCP Inspector, containerized with Docker, and deployed live to Azure App Service.
Project 3 - ToolBridge: an MCP server deployed to the cloud
A minimal Model Context Protocol (MCP) server exposing two tools (get_weather, list_files) plus an HTTP FastAPI wrapper for easy demos. Dockerfile included for containerization; deployed live to Azure App Service on the free tier.
Live demo: https://toolbridge-bhavana-2026.azurewebsites.net/docs
What is MCP?
MCP (Model Context Protocol) is an open standard from Anthropic for connecting LLMs to tools, data sources, and prompts through a common protocol, like USB-C for AI. Any MCP-compatible client (Claude, Cursor, your own LangChain agent) can talk to any MCP server.
Why enterprises care: instead of writing a bespoke integration for every AI assistant, expose internal systems once as an MCP server, with authentication, tool schemas, and versioning all handled via the protocol.
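Under the hood, MCP messages use JSON-RPC 2.0 framing. A sketch of the request a client would send to invoke this project's `get_weather` tool (the initialize handshake that precedes tool calls is omitted; `make_tool_call` is an illustrative helper, not part of the SDK):

```python
import json

# Sketch of an MCP "tools/call" request (JSON-RPC 2.0 framing per the MCP spec).
# The tool name and arguments mirror this project's get_weather tool.
def make_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    request = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }
    return json.dumps(request)

msg = make_tool_call(1, "get_weather", {"city": "Mumbai"})
```

Over the stdio transport, each such message travels as a line of JSON on the server process's stdin, and the response comes back the same way on stdout.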
Tools exposed
| Tool | Input | Returns |
|---|---|---|
| `get_weather` | `{city: str}` | mock weather payload |
| `list_files` | `{directory: str}` | list of files in an allowlisted directory |
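The tool bodies themselves can be tiny. A hypothetical stdlib-only sketch of what the two handlers might look like (the real implementations live in `mcp_server.py`; the payload fields and the `ALLOWED_DIRS` name are illustrative, not the project's actual code):

```python
import os

ALLOWED_DIRS = {"."}  # hypothetical allowlist: only these directories may be listed

def get_weather(city: str) -> dict:
    # Mock payload -- no real weather API is called
    return {"city": city, "temp_c": 30, "condition": "sunny", "mock": True}

def list_files(directory: str) -> list[str]:
    # Reject anything outside the allowlist before touching the filesystem
    if directory not in ALLOWED_DIRS:
        raise PermissionError(f"{directory!r} is not on the allowlist")
    return sorted(os.listdir(directory))
```

In the Python MCP SDK, a decorator such as `@mcp.tool()` registers a function like these and derives its `inputSchema` from the type hints.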
Tech stack
- mcp SDK (Python)
- FastAPI (HTTP wrapper for demos)
- Docker for packaging
- Azure App Service / AWS App Runner for hosting
Run the MCP server locally (Windows)
py -3.11 -m venv .venv
.venv\Scripts\activate
pip install -r requirements.txt
python mcp_server.py
The server runs silently over stdio; press Ctrl+C to stop.
Test with MCP Inspector (the official Anthropic tool)
npx @modelcontextprotocol/inspector python mcp_server.py
Opens an interactive UI at http://localhost:6274 where you can list your tools, call them, and inspect the responses.
Run the HTTP demo locally
uvicorn http_bridge:app --reload
Open http://localhost:8000/docs. Try:
- POST /tools/get_weather with {"city": "Mumbai"}
- POST /tools/list_files with {"directory": "."}
- POST /tools/list_files with {"directory": "C:\\Windows"} - returns 403 (allowlist guard)
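The 403 above comes from the allowlist guard. A minimal sketch of a path-based check, assuming an approach like the following (the real guard lives in the project's code; `ALLOWED_ROOTS` and `is_allowed` are illustrative names). Resolving the path first matters, because a naive string comparison would let something like `./../secret` slip through:

```python
from pathlib import Path

ALLOWED_ROOTS = [Path(".").resolve()]  # hypothetical allowlist of directory roots

def is_allowed(directory: str) -> bool:
    # Resolve symlinks and ".." so traversal tricks can't escape the allowlist
    target = Path(directory).resolve()
    return any(target == root or root in target.parents for root in ALLOWED_ROOTS)
```

In the FastAPI wrapper, a `False` result here would translate to an HTTP 403 (e.g. by raising `HTTPException(status_code=403)`).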
Deploy to Azure App Service (Free tier)
The exact commands used to ship this to a public URL:
az login
az webapp up --runtime "PYTHON:3.11" --name <your-unique-name> --sku F1 --location "centralindia" --resource-group toolbridge-rg
az webapp config set --resource-group toolbridge-rg --name <your-unique-name> --startup-file "uvicorn http_bridge:app --host 0.0.0.0 --port 8000"
az webapp restart --resource-group toolbridge-rg --name <your-unique-name>
Your app will be live at https://<your-unique-name>.azurewebsites.net/docs.
Why this stack
- MCP: standardizes how AI clients discover and call tools, avoiding bespoke per-assistant integrations
- FastAPI HTTP wrapper: exposes the same tools as REST for non-MCP clients (demos, Postman, Swagger)
- Docker: reproducible packaging; a Dockerfile is included for anyone wanting to run it in a container
- Azure App Service F1: free-tier PaaS with a native Python runtime, no OS patching, and automatic SSL
What I learned
- MCP's transport modes (stdio for local, SSE for remote) and when to use each
- Tool schema design: the JSON Schema in `inputSchema` determines how clients discover parameters
- Implementing an allowlist security guard so the AI can't arbitrarily read the filesystem
- The practical difference between Azure's zip-deploy (native Python runtime) and container-deploy (Docker image)
- Why "native services independent of a VM" matters: no OS patching, auto-scaling, managed HTTPS out of the box
