Chat Boat
Gemini, Ollama, and OpenRouter with MCP Server
Chat Boat Application
Overview
The Chat Boat application is a Spring Boot-based chatbot that leverages various Large Language Models (LLMs) to provide conversational AI capabilities. It supports multiple LLM providers, including Ollama, OpenRouter, and Gemini, allowing users to interact with the chatbot using their preferred LLM.
Technologies Used
- Spring Boot
- Java
- Maven
- LLMs (Ollama, OpenRouter, Gemini)
- WebSocket (for real-time communication)
- JJWT (JSON Web Tokens; optional, for extending the app with authentication)
- MySQL
Project Structure
- pom.xml: Maven configuration file containing project dependencies and build settings.
- src/main/java/com/hotelai/ChatBoatApplication.java: Main application class and entry point for the Spring Boot application.
- src/main/java/com/hotelai/config/: Configuration classes, including LlmProperties.java for LLM provider configurations.
- src/main/java/com/hotelai/controller/: Controller classes, including ChatController.java for handling chat-related API endpoints.
- src/main/java/com/hotelai/service/: Service classes, including LlmService.java for handling LLM interactions.
- src/main/resources/: Resources directory containing configuration files and static assets.
  - application.yml: Main configuration file for setting LLM providers and other application properties.
  - application.properties: Alternative configuration file using the .properties format.
  - static/index.html: Static HTML file for the chatbot interface.
Setup Instructions
- Prerequisites:
  - Java Development Kit (JDK) version 17 or higher
  - Maven
  - MySQL database
  - (Optional) Ollama, or accounts with OpenRouter and Gemini, if you want to use those LLMs.
- Configuration:
  - Configure the database connection in src/main/resources/application.properties.
  - Configure the LLM providers in src/main/resources/application.yml. You'll need API keys for OpenRouter and Gemini if you want to use those providers. Ollama is configured to run locally.
- Build:
  mvn clean install
- Run:
  mvn spring-boot:run
  Alternatively, you can run the packaged jar file:
  java -jar target/chat_boat-0.0.1-SNAPSHOT.jar
- Access the Chatbot: Open your web browser and navigate to http://localhost:8080/.
API Documentation
Chat Endpoints
- GET /chat?message={message}
  - Uses the default LLM provider specified in application.yml.
  - Example: http://localhost:8080/chat?message=Hello
- GET /chat/provider?provider={provider}&message={message}
  - Forces a particular LLM provider for this call.
  - Supported providers: ollama, openrouter, gemini
  - Example: http://localhost:8080/chat/provider?provider=ollama&message=Hello
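Because the message travels in the query string, anything beyond plain ASCII words should be URL-encoded before calling these endpoints. A minimal sketch in plain Java (the host and port are the defaults from the setup section; the class name is just for illustration):

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class ChatUrlExample {
    public static void main(String[] args) {
        // Spaces and punctuation in the message must be URL-encoded
        // before being placed in the query string.
        String msg = "Hello, how are you?";
        String encoded = URLEncoder.encode(msg, StandardCharsets.UTF_8);
        System.out.println("http://localhost:8080/chat?message=" + encoded);
        // Prints: http://localhost:8080/chat?message=Hello%2C+how+are+you%3F
    }
}
```

Note that URLEncoder produces form-style encoding (spaces become +), which the query string accepts.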
LLM Provider Configuration
The application.yml file contains the configuration for the LLM providers. Here's an example:
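The original example block is not reproduced here, and the exact property keys are defined by LlmProperties.java. As a hedged sketch only, a typical layout for this kind of multi-provider configuration looks like the following; every key, URL, and model name below is an assumption to be checked against LlmProperties.java:

```yaml
# Hypothetical sketch -- verify key names against LlmProperties.java.
llm:
  default-provider: ollama
  ollama:
    base-url: http://localhost:11434   # local Ollama instance
    model: llama3
  openrouter:
    api-key: ${OPENROUTER_API_KEY}     # required for OpenRouter
    model: openai/gpt-4o-mini
  gemini:
    api-key: ${GEMINI_API_KEY}         # required for Gemini
    model: gemini-1.5-flash
```

Keeping API keys in environment variables (as shown with the `${...}` placeholders) avoids committing secrets to version control.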
Extending the Application
- Authentication: The project includes the JJWT library, which can be used to add authentication to the API endpoints.
- More LLM Providers: You can add support for more LLM providers by implementing new methods in the LlmService.java file.
- Database Integration: The project includes Spring Data JPA, which can be used to store chat history and other data in the MySQL database.
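The "More LLM Providers" point above can be sketched roughly as follows. This is a hypothetical, heavily simplified client using only the JDK's java.net.http; the real LlmService may use Spring's HTTP clients instead, and the class name, endpoint path, and JSON body here are all assumptions:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Hypothetical provider client -- the real integration point is LlmService.java.
public class MyProviderClient {
    private final HttpClient http = HttpClient.newHttpClient();
    private final String baseUrl;

    public MyProviderClient(String baseUrl) {
        this.baseUrl = baseUrl;
    }

    // Builds the chat request. The /v1/chat path and the "prompt" JSON field
    // are placeholders; use whatever your provider's API actually expects.
    public HttpRequest buildRequest(String message) {
        String body = "{\"prompt\": \"" + message.replace("\"", "\\\"") + "\"}"; // naive JSON escaping
        return HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/v1/chat"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
    }

    // Sends the request and returns the raw response body.
    public String chat(String message) throws Exception {
        HttpResponse<String> resp =
                http.send(buildRequest(message), HttpResponse.BodyHandlers.ofString());
        return resp.body();
    }
}
```

In the actual project, such a method would live alongside the existing provider calls in LlmService.java so the /chat/provider endpoint can dispatch to it by name.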
