Docker Starter
Containerized MCP server (Python/FastMCP) + Microsoft Agent Framework .NET client orchestrated with docker-compose.
MCP Docker Starter – Agent Framework + MCP over Docker
Two containers. One agent conversation.
A Python MCP server and a Microsoft Agent Framework (.NET) client wired together with Docker Compose – showing how to compose agents with remote tools over real service-to-service networking.
```
┌───────────────────────┐   Streamable HTTP /mcp   ┌───────────────────────┐
│  agent-client (.NET)  │ ───────────────────────▶ │  mcp-server (Python)  │
│  Microsoft.Agents.AI  │         JSON-RPC         │  FastMCP              │
│  + Azure OpenAI       │                          │  list/add/complete    │
└───────────────────────┘                          └───────────────────────┘
            │                                                  │
            └──────────── mcp-net (bridge network) ────────────┘
```
Part of a Docker-first series for Microsoft Agent Framework:
agent-framework-devcontainer · mcp-docker-starter · ai-agents-compose-stack
What's interesting here (for Docker-curious readers)
| Pattern shown | Where to look |
|---|---|
| Polyglot compose (Python + .NET) | compose.yaml |
| Service-to-service via bridge network | networks: mcp-net |
| Service discovery by name | Client connects to http://mcp-server:8000/mcp |
| Non-root containers | Both Dockerfiles use dedicated system users |
| Healthcheck on the MCP server | mcp-server/Dockerfile |
| Multi-stage .NET build, Alpine runtime | agent-client/Dockerfile |
| Client readiness wait | WaitForEndpointAsync in Program.cs |
| Secrets via .env, never committed | .gitignore + .dockerignore |
Requirements
- Docker Desktop (or Docker Engine + Compose v2)
- An Azure OpenAI resource with a chat deployment (e.g. `gpt-4o-mini`)
Pull from GHCR (skip the build)
Both components ship as multi-arch images (linux/amd64 + linux/arm64) with SBOM and build provenance:
| Image | Package |
|---|---|
| `ghcr.io/ppiova/mcp-docker-starter/mcp-server:latest` | view |
| `ghcr.io/ppiova/mcp-docker-starter/agent-client:latest` | view |
Drop this compose.ghcr.yaml next to your .env and skip the local build entirely:
```yaml
services:
  mcp-server:
    image: ghcr.io/ppiova/mcp-docker-starter/mcp-server:latest
    ports: ["8000:8000"]
    networks: [mcp-net]

  agent-client:
    image: ghcr.io/ppiova/mcp-docker-starter/agent-client:latest
    depends_on: [mcp-server]
    env_file: [.env]
    environment:
      MCP_SERVER_URL: "http://mcp-server:8000/mcp"
    networks: [mcp-net]
    stdin_open: true
    tty: true

networks: { mcp-net: {} }
```

```bash
docker compose -f compose.ghcr.yaml up
```
Quickstart
```bash
git clone https://github.com/ppiova/mcp-docker-starter.git
cd mcp-docker-starter
cp .env.example .env
# edit .env with your Azure OpenAI values
docker compose up --build
```
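The `.env` file supplies the Azure OpenAI settings the client reads. The authoritative keys live in `.env.example`; the variable names below are illustrative of the typical shape:

```
# Illustrative – copy the real key names from .env.example
AZURE_OPENAI_ENDPOINT=https://<your-resource>.openai.azure.com/
AZURE_OPENAI_API_KEY=<your-key>
AZURE_OPENAI_DEPLOYMENT=gpt-4o-mini
```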
You should see (abbreviated):
```
mcp-tasks-server  | [tasks-mcp] SSE listening on http://0.0.0.0:8000/mcp
mcp-agent-client  | MCP host reachable at http://mcp-server:8000/
mcp-agent-client  | Connecting to MCP: http://mcp-server:8000/mcp
mcp-agent-client  | MCP tools discovered: list_tasks, add_task, complete_task, stats
mcp-agent-client  | > Prompt: Show me the current task status...
mcp-agent-client  | --- Response (streaming) ---
mcp-agent-client  | Open tasks:
mcp-agent-client  | 1. Review Agent Framework samples (high)
mcp-agent-client  | 2. Write blog post about MCP (medium)
mcp-agent-client  | ...
```
Ask your own question
```bash
docker compose run --rm agent-client "Complete task 1 and show me the stats"
```
The MCP server
mcp-server/server.py uses FastMCP (Python) to expose 4 tools:
| Tool | Args | Returns |
|---|---|---|
| `list_tasks` | `status?` = `'open' \| 'done'` | array of tasks |
| `add_task` | `title: str`, `priority: 'low' \| 'medium' \| 'high'` | new task |
| `complete_task` | `id: int` | updated task |
| `stats` | – | counters + priority breakdown |
State is in-memory on purpose – it's a starter. Swap in Redis/SQL when you need persistence (add a redis service to compose.yaml).
Debug the MCP server directly
It's exposed on localhost:8000 for convenience. Use the official MCP Inspector:
```bash
npx @modelcontextprotocol/inspector
# Then connect to http://localhost:8000/mcp (transport: streamable-http)
```
Note on DNS-rebinding protection – the Python MCP SDK rejects unknown Host headers by default. Inside Docker Compose the client reaches the server via the service name (`mcp-server:8000`), so we explicitly allow it via `TransportSecuritySettings(allowed_hosts=...)`. If you deploy behind a reverse proxy / ingress, add the extra hostnames through the `MCP_ALLOWED_HOSTS` env var (comma-separated).
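The comma-separated value can be merged into the allow-list before the transport security settings are built. A minimal sketch – the env var name comes from the note above, but this helper and its defaults are illustrative, not the repo's actual code:

```python
import os


def allowed_hosts(default: tuple[str, ...] = ("mcp-server:8000", "localhost:8000")) -> list[str]:
    """Merge built-in Docker/localhost hosts with MCP_ALLOWED_HOSTS (comma-separated)."""
    extra = os.environ.get("MCP_ALLOWED_HOSTS", "")
    hosts = list(default) + [h.strip() for h in extra.split(",") if h.strip()]
    # de-duplicate while preserving order
    return list(dict.fromkeys(hosts))
```

The resulting list would then be passed as `allowed_hosts=` when constructing the transport security settings.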
The Agent Framework client
agent-client/Program.cs (C#, .NET 8) does four things:

1. Waits for the MCP host to be reachable (compose `depends_on` + an app-level readiness check).
2. Opens an SSE MCP connection via the `ModelContextProtocol` client SDK.
3. Lists tools from the MCP server and casts them into `AITool`s.
4. Creates an `AIAgent` via `Microsoft.Agents.AI`, bound to Azure OpenAI, with the MCP tools attached.
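The readiness wait in step 1 is a plain poll-until-reachable loop. The actual implementation is `WaitForEndpointAsync` in Program.cs (C#); the Python sketch below renders the same idea and only checks TCP reachability, which is an assumption about how deep the real check goes:

```python
import socket
import time
from urllib.parse import urlparse


def wait_for_endpoint(url: str, timeout: float = 30.0, interval: float = 0.5) -> bool:
    """Poll until a TCP connection to the URL's host:port succeeds or the deadline passes."""
    parsed = urlparse(url)
    host, port = parsed.hostname, parsed.port or 80
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=interval):
                return True  # something is listening; proceed to open the MCP session
        except OSError:
            time.sleep(interval)  # refused or unreachable yet; retry until the deadline
    return False
```

Compose's `depends_on` only orders container start-up; this app-level loop is what actually bridges the gap until the server is accepting connections.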
The model decides which MCP tools to call based on the prompt. No mock, no handwritten wrappers β MCP tools flow straight into the Agent Framework.
Extending this starter
- Add persistence – swap the in-memory dict for Redis/SQLite and add a `redis`/`db` service.
- Add more MCP tools – decorate with `@mcp.tool()`; they're auto-discovered by the client.
- Scale the client – run N `agent-client` replicas against a single MCP server.
- Production hardening – put the MCP server behind a reverse proxy, enable auth, ship traces via OpenTelemetry (see ai-agents-compose-stack).
- Swap models – point the client at OpenAI.com or Ollama using `Microsoft.Extensions.AI` providers.
Project layout
```
.
├── .devcontainer/
│   └── devcontainer.json
├── agent-client/
│   ├── Dockerfile           # multi-stage, Alpine runtime, non-root
│   ├── AgentClient.csproj   # .NET 8 + Microsoft.Agents.AI + ModelContextProtocol
│   └── Program.cs           # Connects to MCP, creates agent, streams response
├── mcp-server/
│   ├── Dockerfile           # Python 3.12-slim, non-root, healthcheck
│   ├── requirements.txt
│   └── server.py            # FastMCP + 4 tools + SSE transport
├── compose.yaml             # Two services on a private bridge network
├── .dockerignore
├── .env.example
├── .gitignore
├── LICENSE
└── README.md
```
Auth modes
Same pattern as the other repos in this series:
- `AZURE_OPENAI_API_KEY` if set – key auth (simplest inside Docker).
- Otherwise – `AzureCliCredential` (works in the Dev Container after `az login`).
- Otherwise – `DefaultAzureCredential` (Managed Identity / env vars in production).
License
MIT – by Pablo Piovano · Microsoft MVP in AI.
