MCP Server Langchain Project
A demonstration project showcasing the integration of Model Context Protocol (MCP) servers with Langchain and LangGraph for creating AI agents that can interact with custom tools.
Project Overview
This project demonstrates how to:
- Create MCP servers with custom tools (Math and Weather services)
- Connect multiple MCP servers to a Langchain client
- Use LangGraph to create reactive agents that can utilize MCP tools
- Integrate with different LLM providers (Groq, Gemini)
Project Structure
MCPSERVERLangchain-main/
├── server/
│   ├── mathserver.py    # MCP server with math operations
│   └── weather.py       # MCP server with weather information
├── client/
│   └── client.py        # Langchain client connecting to MCP servers
├── main.py              # Main entry point
├── pyproject.toml       # Project dependencies
├── requirements.txt     # Python requirements
└── README.md            # This file
Features
MCP Servers
- Math Server (server/mathserver.py): provides mathematical operations
  - Tools: add(a, b) and multiple(a, b)
  - Transport: stdio
- Weather Server (server/weather.py): provides weather information
  - Tools: get_weather(location)
  - Transport: streamable-http
Client
The client (client/client.py) demonstrates:
- Multi-server MCP client setup
- Integration with LangGraph reactive agents
- Tool invocation through natural language queries
Prerequisites
- Python 3.13 or higher
- API keys for LLM providers (Groq, Gemini)
Installation
- Clone the repository:
git clone <repository-url>
cd MCPSERVERLangchain-main
- Install dependencies:
pip install -r requirements.txt
- Create a .env file in the project root with your API keys:
GROQ_API_KEY=your_groq_api_key_here
GEMINI_API_KEY=your_gemini_api_key_here
Usage
Running the MCP Servers
- Start the Weather Server (in one terminal):
cd server
python weather.py
- Start the Math Server (in another terminal):
cd server
python mathserver.py
Running the Client
- Run the client (in a third terminal):
cd client
python client.py
The client will:
- Connect to both MCP servers
- Create a reactive agent using the specified LLM
- Execute example queries for math and weather operations
Configuration
Environment Variables
- GROQ_API_KEY: Your Groq API key for LLM access
- GEMINI_API_KEY: Your Gemini API key for LLM access
Server Configuration
- Math Server: Uses stdio transport for direct communication
- Weather Server: Uses streamable-http transport on localhost:8000
Client Configuration
The client is configured to connect to:
- Math server via stdio transport
- Weather server via HTTP transport at http://localhost:8000/mcp
Example Queries
The client demonstrates these example queries:
- Math Query: "what's (3 + 5) x 12?"
- Weather Query: "what is the weather in California?"
Dependencies
- langchain-groq: Groq LLM integration
- langchain-mcp-adapters: MCP adapters for Langchain
- langgraph: Reactive agent framework
- mcp: Model Context Protocol implementation
- python-dotenv: Environment variable management
Troubleshooting
- Port Conflicts: Ensure port 8000 is available for the weather server
- API Keys: Verify your API keys are correctly set in the .env file
- Python Version: Ensure you're using Python 3.13 or higher
- Dependencies: Run pip install -r requirements.txt if you encounter import errors
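To diagnose the port-conflict case, a quick stdlib check can tell you whether something is already listening on port 8000 before you start the weather server (the helper name is illustrative):

```python
# Check whether a TCP port is already in use on the local machine.
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)
        # connect_ex returns 0 when the connection succeeds, i.e. port is busy
        return s.connect_ex((host, port)) == 0

if port_in_use(8000):
    print("Port 8000 is busy; stop the other process or change the port.")
```

If the port is taken, either stop the conflicting process or run the weather server on a different port and update the client URL to match.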
Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Submit a pull request
License
This project is licensed under the terms specified in the LICENSE file.
