Fundamentals
Intro to Model Context Protocol (MCP). Learn to build MCP servers and clients with spring-ai and the Python SDK using tools, resources, and prompts. Covers architecture, transport-agnostic messaging, request-response flow, server testing with Inspector, document tools, resource handling, prompts, and practical integration patterns with Claude.
MCP Document Chatbot – Spring AI
Branch: without-mcp/spring-ai-tool-annotation
⚠️ This branch does NOT use a real MCP Server. It is intentionally built that way: the goal is to teach you what MCP replaces before you learn MCP itself. See the `with-mcp/spring-ai-mcp-client` branch for the real MCP implementation.
This is a CLI chatbot that simulates MCP-style tool calling using Spring AI's @Tool annotation.
The LLM can read and edit local documents, and users can use @filename syntax to
automatically inject document content as context, all running inside a single JVM process,
with no separate MCP Server involved.
What you'll learn from this branch:
- How tools are exposed to an LLM (the concept behind MCP Tools)
- How document context gets injected into a prompt (the concept behind MCP Resources)
- How Spring AI manages the agentic tool-call loop automatically
- Why a standard protocol like MCP becomes necessary at scale
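The agentic tool-call loop in the last two bullets is easiest to see in code. Below is a toy, dependency-free sketch of that cycle (the fake model, the `runLoop` method, and the tool names are invented for illustration; Spring AI's real loop lives inside `ChatClient` and is not this code):

```java
import java.util.Map;
import java.util.function.Function;

// Toy sketch of the agentic loop: keep calling the model until it
// stops asking for tools, feeding each tool result back into context.
public class ToolLoopSketch {

    // A model reply either carries final text or a tool request.
    record ModelReply(String text, String toolToCall, String toolArg) {}

    public static String runLoop(String userMessage,
                                 Map<String, Function<String, String>> tools) {
        String context = userMessage;
        while (true) {
            ModelReply reply = fakeModel(context);
            if (reply.toolToCall() == null) {
                return reply.text();          // final answer, loop ends
            }
            // Execute the requested tool and append its result to the context.
            String result = tools.get(reply.toolToCall()).apply(reply.toolArg());
            context = context + "\n[tool result] " + result;
        }
    }

    // Fake "LLM": if the prompt mentions a file and has no tool result
    // yet, ask for the read tool once; otherwise answer directly.
    static ModelReply fakeModel(String prompt) {
        if (!prompt.contains("[tool result]") && prompt.contains("todo.txt")) {
            return new ModelReply(null, "readDocument", "todo.txt");
        }
        return new ModelReply("Answer based on: " + prompt, null, null);
    }

    public static void main(String[] args) {
        Map<String, Function<String, String>> tools =
                Map.of("readDocument", name -> "- buy milk");
        System.out.println(runLoop("Summarise todo.txt", tools));
    }
}
```

The point of the sketch: the application never decides when a tool runs; the model's replies drive the loop, and the framework only executes and re-prompts.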
Architecture
```
┌───────────────────────────────────────────────────────────┐
│  CLI (Terminal)                                           │
│  User types: "Summarise @meeting-notes.md"                │
└────────────────────────┬──────────────────────────────────┘
                         │
            ┌────────────▼──────────────┐
            │ DocumentContextInjector   │ ← scans for @mentions
            │ • finds @meeting-notes.md │   reads file content
            │ • injects content inline  │   appends to message
            └────────────┬──────────────┘
                         │
            ┌────────────▼──────────────┐
            │ ChatClient                │ ← Spring AI
            │ • System prompt           │
            │ • Conversation memory     │
            │ • Tool schemas attached   │
            └────────────┬──────────────┘
                         │ HTTP (JSON)
            ┌────────────▼──────────────┐
            │ OpenAI / LLM              │ ← decides what to do
            │ • Reads injected context  │
            │ • OR calls tool to read   │
            │ • OR calls tool to edit   │
            └────────────┬──────────────┘
                         │ tool_call response
            ┌────────────▼──────────────┐
            │ DocumentTools             │ ← MCP tools layer
            │ • listDocuments()         │   Spring AI executes
            │ • readDocument(name)      │   tool calls, feeds
            │ • editDocument(name,txt)  │   results back to LLM
            │ • createDocument(name,…)  │
            └───────────────────────────┘
                         │
             ./docs/*.md / *.txt  ← local filesystem
```
Two Ways Documents Reach the LLM
| Method | When | How |
|---|---|---|
| @mention injection | User tags a file: `@todo.txt` | File content added directly to user message before LLM call |
| Tool call | LLM decides it needs a file | LLM emits `readDocument("todo.txt")` → Spring AI runs it → result fed back |
The @mention approach is faster (one LLM round-trip). Tool calls give the LLM agency to fetch what it needs dynamically.
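The @mention pass is simple enough to sketch in a few lines. This is an illustrative, self-contained rendition (class and method names are invented, and an in-memory map stands in for the `./docs` directory; the project's actual `DocumentContextInjector` may differ):

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch of @mention injection: scan the message for @file tokens and
// append each matching file's content before the LLM sees the prompt.
public class MentionInjector {

    // Matches tokens like @todo.txt or @meeting-notes.md
    private static final Pattern MENTION =
            Pattern.compile("@([\\w.-]+\\.(?:md|txt))");

    // 'store' stands in for the ./docs directory on disk.
    public static String inject(String message, Map<String, String> store) {
        StringBuilder out = new StringBuilder(message);
        Matcher m = MENTION.matcher(message);
        while (m.find()) {
            String name = m.group(1);
            String content = store.get(name);
            if (content != null) {
                out.append("\n\n--- ").append(name).append(" ---\n")
                   .append(content);
            }
        }
        return out.toString();
    }

    public static void main(String[] args) {
        System.out.println(inject("Summarise @todo.txt",
                Map.of("todo.txt", "- ship release")));
    }
}
```

Because the file content rides along with the user message, the model can answer in a single round-trip, exactly the trade-off the table above describes.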
Quick Start
1. Prerequisites
- Java 21+
- Maven 3.9+
- OpenAI API key (`gpt-4o-mini` is used by default)
2. Clone & configure
```shell
# Set your OpenAI key
export OPENAI_API_KEY=sk-...

# (optional) point to your own docs folder
export DOCS_DIRECTORY=./my-docs
```
3. Run
```shell
cd mcp-doc-chatbot
mvn spring-boot:run
```
On first run, three sample documents are created automatically in ./docs/:
`meeting-notes.md`, `todo.txt`, `project-notes.md`
Usage
```
you> /docs                                # list available documents
you> Summarise @meeting-notes.md          # auto-inject context + summarise
you> What tasks are left in @todo.txt?    # check status
you> Edit @project-notes.md → add a "Deployment" section with k8s steps
you> Create a new document called retro.md with a sprint retrospective template
you> What do both @meeting-notes.md and @todo.txt say about the API?
```
Slash Commands
| Command | Description |
|---|---|
| /docs | List documents (local, no LLM call) |
| /clear | Clear terminal |
| /help | Show help |
| /exit | Quit |
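Slash commands are handled locally, before anything is sent to the model. A minimal dispatch sketch (the class name and return values are invented; the project's `CliChatRunner` handles this differently in detail):

```java
// Route slash commands locally; everything else becomes a chat turn.
public class SlashCommands {

    public static String handle(String input) {
        return switch (input.trim()) {
            case "/docs"  -> "LIST_DOCS";   // local listing, no LLM call
            case "/clear" -> "CLEAR";
            case "/help"  -> "HELP";
            case "/exit"  -> "EXIT";
            default       -> "SEND_TO_LLM"; // normal message for the model
        };
    }

    public static void main(String[] args) {
        System.out.println(handle("/docs"));
    }
}
```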
Key Concepts Demonstrated
1. MCP Tools (DocumentTools.java)
Each @Tool-annotated method is an MCP tool:
- Spring AI serializes the method signature into a JSON schema
- The schema is sent to the LLM with every request
- The LLM can invoke any tool at any reasoning step
- Spring AI intercepts the tool call, runs the Java method, feeds the result back
This is equivalent to what an MCP Server exposes; the difference is that in a real
MCP setup the tools live in a separate process with a standardized protocol.
Spring AI @Tool is MCP-in-process.
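The first bullet, turning a method signature into a JSON schema, can be illustrated with plain reflection. This is not Spring AI's actual code, just a minimal stand-in showing the idea (note that real parameter names only survive compilation with the `-parameters` flag; without it they appear as `arg0`):

```java
import java.lang.reflect.Method;
import java.lang.reflect.Parameter;
import java.util.StringJoiner;

// Illustration of the idea behind @Tool: a Java method signature
// can be serialized into a JSON schema an LLM understands.
public class SchemaSketch {

    // A stand-in tool method, like readDocument(String name).
    public static String readDocument(String name) { return "..."; }

    public static String schemaFor(Method m) {
        StringJoiner props = new StringJoiner(",");
        for (Parameter p : m.getParameters()) {
            // Simplification: treat every parameter as a string.
            props.add("\"" + p.getName() + "\":{\"type\":\"string\"}");
        }
        return "{\"name\":\"" + m.getName()
                + "\",\"parameters\":{\"type\":\"object\",\"properties\":{"
                + props + "}}}";
    }

    public static String exampleSchema() {
        try {
            return schemaFor(SchemaSketch.class
                    .getMethod("readDocument", String.class));
        } catch (NoSuchMethodException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(exampleSchema());
    }
}
```

The resulting JSON is what rides along with every request, which is why the model "knows" it can call `readDocument` at any reasoning step.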
2. Context Injection (DocumentContextInjector.java)
The @mention pattern mimics how tools like Cursor inject file content via MCP Resources.
Instead of waiting for the LLM to call readDocument, we push the content proactively.
3. Conversation Memory
MessageChatMemoryAdvisor keeps the full conversation history in memory and
automatically appends it to each request. The LLM can refer back to earlier messages.
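A toy stand-in for what a chat-memory advisor does (this is not `MessageChatMemoryAdvisor` itself, just the underlying idea: store every turn and prepend the history to each new request):

```java
import java.util.ArrayList;
import java.util.List;

// Minimal conversation memory: every request carries the full history.
public class MemorySketch {
    private final List<String> history = new ArrayList<>();

    // Record the user turn and return the full prompt to send.
    public String buildRequest(String userMessage) {
        history.add("user: " + userMessage);
        return String.join("\n", history);
    }

    public void recordReply(String assistantMessage) {
        history.add("assistant: " + assistantMessage);
    }

    public static void main(String[] args) {
        MemorySketch mem = new MemorySketch();
        mem.buildRequest("My name is Ada");
        mem.recordReply("Hi Ada!");
        // The second request includes the earlier exchange.
        System.out.println(mem.buildRequest("What is my name?"));
    }
}
```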
Swapping the LLM Provider
Because Spring AI abstracts the model layer, swapping providers is a one-line change in pom.xml:
```xml
<!-- OpenAI (default) -->
<artifactId>spring-ai-openai-spring-boot-starter</artifactId>

<!-- Anthropic Claude -->
<artifactId>spring-ai-anthropic-spring-boot-starter</artifactId>

<!-- Local (Ollama) -->
<artifactId>spring-ai-ollama-spring-boot-starter</artifactId>
```
Update application.yml with the corresponding config key, and you're done.
No changes to DocumentTools, CliChatRunner, or any business logic.
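For instance, switching to Anthropic might look like the fragment below. The property names follow Spring AI's `spring.ai.*` convention, and the model name is illustrative; check the docs for your Spring AI version, since both property keys and model identifiers have shifted between releases:

```yaml
spring:
  ai:
    anthropic:
      api-key: ${ANTHROPIC_API_KEY}
      chat:
        options:
          model: claude-3-5-sonnet-latest
```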
Project Structure
```
mcp-doc-chatbot/
├── pom.xml
├── docs/                                # document store (auto-created)
│   ├── meeting-notes.md
│   ├── todo.txt
│   └── project-notes.md
└── src/main/
    ├── java/com/example/mcpchat/
    │   ├── McpChatbotApplication.java   # Spring Boot entry point
    │   ├── DocumentTools.java           # MCP tools (@Tool methods)
    │   ├── DocumentContextInjector.java # @mention → context injection
    │   └── CliChatRunner.java           # CLI loop + ChatClient setup
    └── resources/
        └── application.yml
```
