Kafka Orders AI
Kafka KStreams Interactive Query exposed as MCP server powered by Quarkus
Run locally
Prerequisites
Required software:
- JDK 21 (Java Development Kit)
- Maven Wrapper (automatically downloaded by the provided scripts)
- Podman (for container management)
Authenticate Podman to download the container base image:
podman login registry.redhat.io
TIP: See the Red Hat documentation for details on Registry Authentication.
macOS users have reported missing dependencies in the order-query project.
If you hit this, add the following snippet to order-query/pom.xml:
<dependency>
    <groupId>org.rocksdb</groupId>
    <artifactId>rocksdbjni</artifactId>
    <version>7.9.2.redhat-00006</version>
    <scope>compile</scope>
</dependency>
Run Quarkus projects in dev mode
Run the order producer:
./mvnw -f order-producer quarkus:dev
In a different terminal, run the order query:
./mvnw -f order-query quarkus:dev -Ddebug=5006 -Dquarkus.http.port=8090
Run LibreChat
LibreChat is an open-source, self-hosted chat platform designed to provide a ChatGPT-like experience with enhanced privacy, flexibility, and extensibility. It allows users to interact with various AI models and services, including OpenAI, Google, and custom endpoints, all within a customizable web interface. LibreChat supports features such as multi-user authentication, social logins, conversation management, and integration with external tools and APIs.
Here are the instructions to build and run a local container:

- Clone the LibreChat project:

  git clone https://github.com/danny-avila/LibreChat.git
  cd LibreChat

- This project has been tested with v0.8.0-rc3:

  git checkout v0.8.0-rc3

- Build the image using the Containerfile provided in this project:

  podman build -t librechat:v0.8.0-rc3 -f ${PATH_TO_KAFKA_ORDERS_AI}/librechat/Containerfile .
Configure LibreChat
- Navigate to the /librechat directory in this project:

  cd ${PATH_TO_KAFKA_ORDERS_AI}/librechat

- Create a librechat-env.yaml file to store your API token(s). For example:

  apiVersion: v1
  kind: ConfigMap
  metadata:
    name: librechat-env
  data:
    MISTRAL_API_KEY: a...........................z

  TIP: You can generate your API key by signing up for Model as a Service (MaaS) at: https://maas.apps.prod.rhoai.rh-aiservices-bu.com/. LibreChat also supports other models, such as OpenAI: simply provide the relevant API key in your configuration.

- Make sure that the configuration file (librechat.yaml) includes the address of the local Quarkus MCP Server:

  mcpServers:
    kafka:
      url: http://host.docker.internal:8090/mcp/sse
      timeout: 60000

- Launch the LibreChat pod:

  podman kube play --configmap librechat-env.yaml podman-kube-play.yaml
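The `/mcp/sse` URL above points at the MCP server's Server-Sent Events transport: the server pushes newline-delimited `event:` / `data:` fields, with a blank line terminating each frame. As an illustration of that wire format only (this parser is not part of the project, and the sample payload below is a typical MCP-style handshake, not an exact capture), a minimal frame parser looks like:

```python
def parse_sse(stream: str):
    """Split a raw Server-Sent Events stream into (event, data) frames.

    Frames are separated by a blank line; each non-blank line is a
    "field: value" pair. Illustrative sketch of the wire format only.
    """
    frames = []
    event, data_lines = "message", []  # "message" is the SSE default event type
    for line in stream.splitlines():
        if not line:  # blank line ends the current frame
            if data_lines:
                frames.append((event, "\n".join(data_lines)))
            event, data_lines = "message", []
        elif line.startswith("event:"):
            event = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_lines.append(line[len("data:"):].strip())
    return frames

raw = "event: endpoint\ndata: /mcp/messages?sessionId=abc123\n\n"
print(parse_sse(raw))  # [('endpoint', '/mcp/messages?sessionId=abc123')]
```

The generous 60000 ms timeout in the configuration above matters here: the SSE connection is long-lived, and LibreChat keeps it open while the agent calls tools.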
Build the Order Agent in the LibreChat UI

- Access the LibreChat user interface by navigating to http://localhost:3080/ in your web browser.

- Complete the sign-up process to create your LibreChat user account.

- Log in with your new credentials and accept the terms and conditions when prompted.

- In the right panel of the LibreChat UI, configure your agent as follows:

  - Name: Kafka Agent
  - Model: mistral-small-24b-w8a8
  - Instructions: you will provide order information using available tools. When the result contains more than 2 items, format the response in table.

- Click on Add Tools to open the tools selection dialog. Locate the orders_tool tile, click the Add + button, and then close the dialog.
Chat with the agent

- In the top navigation bar, click on the default model (gpt-4o-mini) and select My agent > Kafka agent from the dropdown.

- Before starting a conversation, inject some new orders by visiting http://localhost:8080/. For best results, try adding 15 orders in 3 separate rounds, waiting a few seconds between each batch.

- Now you can chat with your agent. Here are some example prompts you can try:

  - show last 10 orders
  - show last 4 orders and compute the average amount
  - show orders aggregated by client over the last 10 minutes
  - how many orders come from Stark industries in the last 10 minutes
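Conceptually, prompts like "orders aggregated by client over the last 10 minutes" are answered by the orders_tool querying the Kafka Streams state store in order-query. The sketch below mimics that windowed aggregation in plain Python over an in-memory list; the field names, window length, and sample clients are illustrative, not the project's actual schema:

```python
from datetime import datetime, timedelta

def aggregate_by_client(orders, now, window=timedelta(minutes=10)):
    """Count and sum recent orders per client: a plain-Python stand-in
    for a windowed Kafka Streams aggregation (illustrative only)."""
    totals = {}
    for order in orders:
        if now - order["timestamp"] <= window:  # keep only orders inside the window
            entry = totals.setdefault(order["client"], {"count": 0, "amount": 0.0})
            entry["count"] += 1
            entry["amount"] += order["amount"]
    return totals

now = datetime(2024, 1, 1, 12, 0)
orders = [
    {"client": "Stark Industries", "amount": 120.0, "timestamp": now - timedelta(minutes=3)},
    {"client": "Stark Industries", "amount": 80.0,  "timestamp": now - timedelta(minutes=7)},
    {"client": "Wayne Enterprises", "amount": 50.0, "timestamp": now - timedelta(minutes=30)},
]
print(aggregate_by_client(orders, now))
# {'Stark Industries': {'count': 2, 'amount': 200.0}}
```

The 30-minute-old order falls outside the window and is dropped, which is why prompts scoped to "the last 10 minutes" can return fewer orders than you injected.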
Note: Due to the inherent unpredictability of large language models (LLMs), responses may occasionally be inaccurate or incomplete.
Deploy Quarkus projects on OpenShift
Prerequisites
The Quarkus applications rely on a pre-existing Kafka server installation.
The actual Kafka bootstrap address must be configured in k8s/base/configmap.yaml.
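The exact keys in k8s/base/configmap.yaml are project-specific, so check the file in this repository for the real ones. As a sketch of what it typically holds: Quarkus maps the environment variable KAFKA_BOOTSTRAP_SERVERS onto the kafka.bootstrap.servers property, so a ConfigMap along these lines (names and address are illustrative) would point the applications at the broker:

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: kafka-orders-config            # illustrative name
data:
  # Quarkus resolves kafka.bootstrap.servers from this env var
  KAFKA_BOOTSTRAP_SERVERS: my-cluster-kafka-bootstrap.kafka.svc:9092
```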
Deployment
Create project:
oc new-project kafka-orders-ai
Set shared configuration and pvc:
oc apply -k k8s
Deploy the order producer:
./mvnw -f order-producer package -DskipTests -Dquarkus.kubernetes.deploy=true
Deploy the order query:
./mvnw -f order-query package -DskipTests -Dquarkus.kubernetes.deploy=true
Remove all project artifacts from OpenShift:
oc delete all -l app.kubernetes.io/part-of=kafka-orders-ai
LibreChat Deployment
Create project:
oc new-project librechat
Clone the LibreChat repository and copy the Containerfile into the cloned repository.
From the LibreChat folder, build the image locally:
podman build --tag "librechat:local" --file Containerfile .
Push to the registry and set the local lookup policy:
REGISTRY=$(oc registry info)
podman login -u $(oc whoami) -p $(oc whoami --show-token) $REGISTRY
podman tag localhost/librechat:local $REGISTRY/librechat/librechat:latest
podman push $REGISTRY/librechat/librechat:latest
oc patch imagestream librechat --type merge -p '{"spec":{"lookupPolicy":{"local":true}}}'
Update the configuration file (librechat.yaml) to match the service address of the Quarkus MCP Server, e.g.:
mcpServers:
kafka:
url: http://order-query.kafka-orders-ai.svc/mcp/sse
timeout: 60000
Deploy the manifests via Kustomize:
oc apply -k librechat
