# AgenticBI — Unified LLM-Driven E-Commerce Automation

A full agentic architecture powered by Ballerina, WSO2 Micro Integrator, and WSO2 API Manager.
This project implements a complete multi-agent generative system for handling:
- Placing Orders
- Updating Orders
- Cancelling + Refunding Orders
- Routing Messages Across Specialized Agents
- Safely calling backend tools using MCP (Model Context Protocol)
- Storing and retrieving orders through a PostgreSQL microservice
- Exposing all functionality through API Manager and an MCP Server
It includes:

- `/agents` — 4 LLM Agents (Router, PlaceOrder, UpdateOrder, CancelRefund)
- `/backend` — MCP Tool Runtime (Product Search, Order DB, Invoice Creation, Refunds, …)
- `/facade` — WSO2 MI deployment that exposes the backend tools
- `/docker-compose.yml` — Infra stack: PostgreSQL, Prometheus, Grafana, Jaeger
- `/init.sql` — DB initialization schema
## Folder Structure

```
AgenticBI/
├── agents/              # All LLM agents + tooling integration
├── backend/             # Ballerina MCP Tools service (DB, invoices, products)
├── facade/              # WSO2 Micro Integrator deployment
├── monitoring/          # Prometheus configuration
├── docker-compose.yml
├── start_all.sh         # Script to start the entire environment
├── stop_all.sh          # Script to stop the entire environment
├── init.sql             # DB bootstrap schema for the tmpfs Postgres instance
└── README.md
```
## Requirements
Before running anything, ensure you have:
| Component | Version | Notes |
|---|---|---|
| Ballerina | 2201.9.0+ | Required for building agents + backend |
| WSO2 Micro Integrator | ≥ 4.5.0 | Needed to expose backend tools |
| WSO2 API Manager | ≥ 4.6.0 | Required to expose the facade as an API + MCP Server |
| Docker / Docker Compose | Latest | For Postgres + observability stack |
| OpenAI account | GPT-4o access | Required for agents |
| macOS/Linux | Recommended | Scripts target Unix shells |
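Before going further, it can help to confirm the required CLI tools are actually on your `PATH`. The snippet below is a small sketch, not part of the repo: the `need` helper is a hypothetical function of ours, and `bal` is the Ballerina CLI.

```shell
# Pre-flight sketch: `need` is a hypothetical helper, not part of this repo.
# It prints a "missing:" line for any tool not found on PATH.
need() { command -v "$1" >/dev/null 2>&1 || echo "missing: $1"; }

# `bal` is the Ballerina CLI; docker covers the Compose stack.
for tool in bal docker; do need "$tool"; done
```

If nothing is printed, the basic toolchain is in place.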
## Configuration Required Before First Run

### 1. Edit `/agents/Config.toml`

This file contains your OpenAI credentials, backend endpoints, tenant IDs, etc.

You must update/include these fields:
```toml
MCP_ENVIRONMENT = "SANDBOX"

MCP_SANDBOX_TOKEN_URL = ""
MCP_SANDBOX_CLIENT_ID = ""
MCP_SANDBOX_CLIENT_SECRET = ""

# For now, mirror the sandbox URLs for prod; in a real setup this will be a different host.
MCP_PROD_TOKEN_URL = ""
MCP_PROD_CLIENT_ID = ""
MCP_PROD_CLIENT_SECRET = ""

# Optional; leave empty if a scope is not required.
MCP_OAUTH_SCOPE = ""

MCP_SERVER_URL = ""
OPENAI_API_KEY = ""

[ballerina.observe]
metricsEnabled = true
metricsReporter = "prometheus"
tracingEnabled = true
tracingProvider = "jaeger"

[ballerinax.prometheus]
port = 9797
host = "0.0.0.0"

[ballerinax.jaeger]
agentHostname = "jaeger"
agentPort = 4317
samplerType = "const"
samplerParam = 1.0
reporterFlushInterval = 2000
reporterBufferSize = 1000
```
If this is not configured correctly:
- Agents will fail to run
- Tools wonβt connect
- Router will return errors
- LLM calls may not execute due to missing credentials
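One way to fail fast on an unfinished `Config.toml` is to scan it for required keys that are still empty strings. This is a sketch, not a script shipped with the repo; the key list mirrors the fields above, and the sample file (including its values) is purely illustrative.

```shell
# Hypothetical pre-flight check: print each required key that is still "".
check_config() {
  # $1 = path to a Config.toml
  for key in MCP_SANDBOX_TOKEN_URL MCP_SANDBOX_CLIENT_ID \
             MCP_SANDBOX_CLIENT_SECRET MCP_SERVER_URL OPENAI_API_KEY; do
    grep -Eq "^${key}[[:space:]]*=[[:space:]]*\"\"" "$1" && echo "$key is empty"
  done
}

# Demo against an illustrative sample config with one value left blank.
cat > /tmp/sample_config.toml <<'EOF'
MCP_SANDBOX_TOKEN_URL = "https://localhost:9443/oauth2/token"
MCP_SANDBOX_CLIENT_ID = "abc123"
MCP_SANDBOX_CLIENT_SECRET = "s3cret"
MCP_SERVER_URL = "https://localhost:8243/mcp"
OPENAI_API_KEY = ""
EOF
check_config /tmp/sample_config.toml
```

Running it against the sample prints `OPENAI_API_KEY is empty`; point it at `agents/Config.toml` before a real run.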
### 2. Expose Backend Tools via WSO2 Micro Integrator (Facade)

The project includes fully prepared WSO2 MI artifacts under:

`facade/src/main/java/wso2mi/artifacts`

You have two supported ways to expose the backend tools from Micro Integrator to API Manager:

- **Option A** — Expose as a Service Catalogue API (recommended for MI → APIM integration)
- **Option B** — Import the OpenAPI definition manually into APIM
### 3. Publish the MI API as an MCP Server in API Manager

Step-by-step:

#### 3.1 Deploy the MI API into APIM

In the API Manager Publisher:

- Create an MCP Server → point it to your API
- Configure the MCP Server
- Publish the MCP Server
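Once the MCP Server is published, a quick smoke test is to send it a JSON-RPC `tools/list` request. The gateway URL and token below are placeholders for your own deployment, so the actual `curl` call is left commented out; the snippet only builds and prints the payload.

```shell
# Build a standard MCP JSON-RPC request that lists the exposed tools.
PAYLOAD='{"jsonrpc":"2.0","id":1,"method":"tools/list"}'
echo "$PAYLOAD"

# Placeholder invocation — substitute your own gateway URL and token:
# curl -s -X POST "https://localhost:8243/<your-mcp-context>" \
#   -H "Authorization: Bearer $TOKEN" \
#   -H "Content-Type: application/json" \
#   -d "$PAYLOAD"
```

A healthy MCP Server should answer with the tool names the facade exposes.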
## Starting the Entire System

Open `start_all.sh` and adjust the folder paths to your local configuration.

Then start everything with one command:

```sh
./start_all.sh
```
This launches:
| Component | Run mode |
|---|---|
| Agents | Local Ballerina |
| Backend Tools | Local Ballerina |
| WSO2 MI | Local server |
| WSO2 API Manager | Local server |
| Postgres | Docker |
| Prometheus | Docker |
| Grafana | Docker |
| Jaeger | Docker |
Check the logs inside:

`logs/`
## Stopping Everything

```sh
./stop_all.sh
```
This gracefully terminates:
- Agents
- Backend
- MI
- API Manager
- Docker services
## Testing the System

Before running any cURL tests, you must first expose the unified Agent API (from `/agents`) as an AI API in WSO2 API Manager.

### Step 0 — Publish the Unified Agent API as an AI API in APIM

(Optional but recommended) Apply policies:

- AI governance / prompt protection
- Custom mediation policies

Save → Publish the API.

Go to the DevPortal, subscribe, generate tokens, and test the API.
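With a token from the DevPortal, a cURL test might look like the following. The gateway URL and the request body shape (`sessionId`, `message`) are assumptions for illustration, not the repo's actual contract — check the published API definition for the real one.

```shell
# Placeholders only: substitute your own gateway URL and DevPortal token.
TOKEN="<access-token-from-devportal>"
BODY='{"sessionId":"demo-1","message":"Place an order for 2 units of product P-100"}'
echo "$BODY"

# Illustrative call (context path is an assumption):
# curl -s -X POST "https://localhost:8243/<agent-api-context>" \
#   -H "Authorization: Bearer $TOKEN" \
#   -H "Content-Type: application/json" \
#   -d "$BODY"
```

The Router agent should pick up the natural-language message and dispatch it to the PlaceOrder agent.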
## Observability

### Prometheus

http://localhost:9090

### Grafana

http://localhost:3000

Default login: `admin / admin`

### Jaeger (Distributed Tracing)

http://localhost:16686
## Known Constraints

- The database runs on tmpfs, meaning all orders reset when Postgres restarts.
- You MUST expose the MI backend API through APIM before the agents can call tools.
- `Config.toml` must be updated before first run.
- Sessions stick across messages; use `"new subject"` to reset context routing.
