vobase
The app framework built for AI coding agents.
Own every line. Your AI already knows how to build on it.
A full-stack TypeScript framework that gives you auth, database, storage, jobs, and a first-class AI agent runtime in a single Bun process. Docker Compose Postgres for local dev, managed Postgres in production. Like a self-hosted Supabase — but you own every line of code. Like Pocketbase — but it's TypeScript you can read and modify.
AI coding agents (Claude Code, Cursor, Codex) understand vobase out of the box. Strict conventions and a uniform module shape mean generated code works on the first try — not the third.
You own the code. You own the data. You own the infrastructure.
what you get
One `bun create vobase` and you have a working full-stack app:
| Primitive | What it does |
|---|---|
| Runtime | Bun — native TypeScript, ~50ms startup, built-in test runner. One process, one container. |
| Database | PostgreSQL via Drizzle. Docker Compose Postgres (pgvector/pg17) for local dev, managed Postgres in production. Full SQL, ACID transactions, pgvector for embeddings. |
| Auth | better-auth. Sessions, passwords, email OTP, CSRF. RBAC with role guards, API keys, organizations. SSO/2FA as plugins. |
| API | Hono — ~14KB, typed routing, Bun-first. Every AI coding tool already knows Hono. |
| Audit | Built-in audit log, record change tracking, and auth event hooks. Every mutation is traceable. |
| Sequences | Gap-free business number generation (INV-0001, PO-0042). Transaction-safe, never skips. |
| Storage | File storage with virtual buckets. Local or S3/R2 backends. Metadata tracked in Postgres. |
| Channels | Multi-channel messaging with pluggable adapters: WhatsApp (Cloud API), email (Resend, SMTP). Inbound webhooks, outbound sends, delivery tracking. All messages logged. |
| Integrations | Encrypted credential vault for external services. AES-256-GCM at rest. Platform-aware: opt-in multi-tenant OAuth handoff via HMAC-signed JWT. |
| Jobs | Background tasks with retries, cron, and job chains. pg-boss backed — Postgres only, no Redis. |
| Realtime | Server-push via PostgreSQL LISTEN/NOTIFY + SSE. No WebSockets. Modules `pg_notify` after commit; the frontend hook invalidates matching TanStack Query keys. |
| Agent harness | First-class AI agent runtime (pi-agent-core + pi-ai). Frozen system prompt per wake, byte-stable provider cache, tool budget spill, steer/abort between turns, journaled events, idle resumption, restart recovery. |
| Workspace | Virtual filesystem materialized per-wake from your modules. AGENTS.md is composed from per-module fragments; agents read /staff/<id>/profile.md, /contacts/<id>/MEMORY.md, etc. RO enforcement at the FS boundary. |
| CLI | `@vobase/cli` — standalone, catalog-driven binary. Modules register verbs via `defineCliVerb`; the same body runs in-process (agent bash sandbox) and over HTTP-RPC (`vobase` binary). |
| Frontend | React + TanStack Router + shadcn/ui + ai-elements + DiceUI + Tailwind v4. Type-safe routing with codegen, code-splitting. You own the component source. |
| MCP | Model Context Protocol server in the same process. AI tools can read your schema, list modules, and view logs before generating code. |
| Deploy | Dockerfile + railway.json included. One railway up or docker build and you're live. |
Locally, `docker compose up -d` starts a pgvector/pg17 Postgres instance. `bun run dev` and you're building. In production, point `DATABASE_URL` at any managed Postgres.
quick start
```sh
bun create vobase my-app
cd my-app
docker compose up -d
bun run db:reset
bun run dev
```
Backend on :3001, frontend on :5173. Ships with the agent-native helpdesk template — messaging, channels, contacts, team, drive, agents — already wired up.
what you can build
Every module is a self-contained directory: schema, service, handlers, jobs, pages, and an `agent.ts` slot that publishes tools, materializers, RO hints, and AGENTS.md fragments to the harness. No plugins, no marketplace. Just TypeScript you own.
| Use Case | What Ships |
|---|---|
| Agent-native helpdesk | The default template. WhatsApp + email inbox, contact memory, staff-mention fan-out, supervisor coaching, scheduled follow-ups, approval gates, drive overlays. |
| SaaS Starter | User accounts, billing integration, subscription management. Auth + jobs + webhooks handle the plumbing. |
| Internal Tools | Admin panels, operations dashboards, approval workflows. Status machines enforce business logic. Audit trails track every change. |
| CRM & Contacts | Companies, contacts, interaction timelines, deal tracking. Cross-module references via service imports — no FK across module boundaries. |
| Project Tracker | Tasks, assignments, status workflows, notifications. Background jobs handle reminders and escalations. |
| Billing & Invoicing | Invoices, line items, payments, aging reports. Integer money ensures exact arithmetic. Gap-free numbering via transactions. |
| Your Vertical | Property management, fleet tracking, field services — whatever the business needs. Describe it to your AI tool. It generates the module. |
AI coding agents generate modules from your conventions. Like `npx shadcn add button` — files get copied, you own the code.
how it works
Vobase makes itself legible to every AI coding tool on the market.
The framework ships with one canonical module shape, one write-path discipline, and a harness that AI agents drive at runtime. When you need a new capability:
- Open your AI tool and describe the requirement
- The AI reads your existing schema, the canonical module shape, and the relevant `.claude/skills/` packs
- It generates a complete module — schema, service, handlers, jobs, pages, agent slot, tests, seed data
- You review the diff, run `bun run dev`, and it works
Skill packs cover the parts where apps get tricky: money stored as integer cents (never floats), status transitions as explicit state machines (not arbitrary string updates), gap-free business numbers generated inside database transactions, single-write-path enforcement via `check:shape`, frontend bundle isolation via `check:bundle`.
These conventions are what make AI-generated modules work on the first try.
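Two of those conventions fit in a few lines. This is an illustrative sketch — the values and names are made up here, not taken from an actual skill pack:

```typescript
// Money as integer cents: exact arithmetic, format only at the display edge.
const lineItems = [{ cents: 1999 }, { cents: 350 }]
const totalCents = lineItems.reduce((sum, li) => sum + li.cents, 0)
const display = `$${(totalCents / 100).toFixed(2)}`

// Status transitions as an explicit map: anything not listed is rejected,
// so a typo'd or out-of-order status update fails loudly instead of
// silently writing an arbitrary string.
const transitions: Record<string, readonly string[]> = {
  todo: ['in_progress'],
  in_progress: ['todo', 'done'],
  done: [],
}

function transition(current: string, next: string): string {
  if (!(transitions[current] ?? []).includes(next)) {
    throw new Error(`illegal transition: ${current} -> ${next}`)
  }
  return next
}
```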
The thesis: your specs and domain knowledge are the asset. AI tools are the compiler. The compiler improves every quarter. Your skills compound forever.
what a module looks like
Every module is a thin aggregator over sibling files. module.ts declares the contract; everything else lives next to the code that owns the side-effect.
```ts
// modules/projects/module.ts
import type { ModuleDef } from '~/runtime'
import { projectsAgent } from './agent'
import { projectListVerb } from './verbs/project-list'
import { createProjectsService, installProjectsService } from './service/projects'
import * as web from './web'

const projects: ModuleDef = {
  name: 'projects',
  requires: ['team'],
  web: { routes: web.routes },
  jobs: [],
  agent: projectsAgent,
  init(ctx) {
    installProjectsService(createProjectsService({ db: ctx.db }))
    ctx.cli.registerAll([projectListVerb])
  },
}

export default projects
```
```
modules/projects/
  module.ts     — thin aggregator (above)
  schema.ts     — Drizzle table definitions
  state.ts      — status transitions, state machine
  service/      — transactional write-path (sole writer of this module's tables)
  handlers/     — Hono routes (HTTP API)
  web.ts        — route barrel mounted under /api/projects
  pages/        — React pages: list, detail, create
  components/   — React components owned by this module
  hooks/        — TanStack Query hooks
  jobs.ts       — pg-boss handlers
  agent.ts      — agent slot: tools, materializers, roHints, AGENTS.md fragments
  tools/        — defineAgentTool; colocated with the service
  verbs/        — defineCliVerb; runs in agent bash and the CLI binary
  cli.ts        — barrel exporting <module>Verbs
  seed.ts       — demo data
  defaults/     — *.agent.yaml, *.schedule.yaml; opt-in starter content
  skills/       — inline skill bodies the agent reads via drive overlay
  *.test.ts     — colocated bun test
```
schema example — Drizzle + PostgreSQL with typed columns, timestamps, status enums
```ts
// modules/projects/schema.ts
import { pgTable, text, integer, timestamp, check } from 'drizzle-orm/pg-core'
import { sql } from 'drizzle-orm'
import { nanoidPrimaryKey } from '@vobase/core'

export const projects = pgTable('projects', {
  id: nanoidPrimaryKey(),
  name: text('name').notNull(),
  description: text('description'),
  status: text('status').notNull().default('active'),
  ownerId: text('owner_id').notNull(),
  createdAt: timestamp('created_at', { withTimezone: true }).notNull().defaultNow(),
}, (t) => [
  check('projects_status_chk', sql`${t.status} in ('active','archived','deleted')`),
])

export const tasks = pgTable('tasks', {
  id: nanoidPrimaryKey(),
  projectId: text('project_id').references(() => projects.id),
  title: text('title').notNull(),
  status: text('status').notNull().default('todo'),
  assigneeId: text('assignee_id'),
  priority: integer('priority').notNull().default(0),
}, (t) => [
  check('tasks_status_chk', sql`${t.status} in ('todo','in_progress','done')`),
])
```
`check:shape` enforces that only `service/projects.ts` writes to `projects` — handlers and jobs go through the service.
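As a rough illustration of that sole-writer discipline — with an in-memory Map standing in for the module's Drizzle tables, so none of this is the actual `@vobase/core` API — the service owns every mutation and everything else only calls it:

```typescript
// Illustrative sketch only: a Map stands in for the module's tables.
// In a real module, check:shape ensures this file is the only writer.
type Project = { id: string; name: string; status: 'active' | 'archived' }

export function createProjectsService(db: Map<string, Project>) {
  return {
    // The single write path for inserts.
    create(input: { id: string; name: string }): Project {
      const row: Project = { ...input, status: 'active' }
      db.set(row.id, row)
      return row
    },
    // Status changes also funnel through the service.
    archive(id: string): Project {
      const row = db.get(id)
      if (!row) throw new Error(`project ${id} not found`)
      row.status = 'archived'
      return row
    },
    list(): Project[] {
      return [...db.values()]
    },
  }
}
```

Handlers and jobs never touch `db` directly; they call these methods, which keeps every write auditable from one file.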
handler example — Hono routes with Zod validation, typed RPC client
```ts
// modules/projects/handlers/list.ts
import { Hono } from 'hono'
import { zValidator } from '@hono/zod-validator'
import { z } from 'zod'
import { getCtx } from '~/runtime'
import { projectsService } from '../service/projects'

export const listRoute = new Hono().get(
  '/',
  zValidator('query', z.object({ status: z.enum(['active', 'archived']).optional() })),
  async (c) => {
    const ctx = getCtx(c)
    const { status } = c.req.valid('query')
    const items = await projectsService().list({ ownerId: ctx.user.id, status })
    return c.json(items)
  },
)
```
The frontend gets fully typed API calls via the Hono RPC client (`src/lib/api-client.ts`):
```ts
import { useQuery } from '@tanstack/react-query'
import { api } from '@/lib/api-client'

export function useProjects() {
  return useQuery({
    queryKey: ['projects'],
    queryFn: async () => {
      const res = await api.projects.$get({ query: {} })
      return await res.json() // fully typed
    },
  })
}
```
`use-realtime-invalidation.ts` maps `pg_notify` table payloads onto the first element of the TanStack `queryKey` — services emit notifies after commit and the UI re-fetches automatically.
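The matching step reduces to a pure function. A hypothetical sketch — the real hook feeds the result into TanStack Query's `queryClient.invalidateQueries`; `keysToInvalidate` is an invented name for illustration:

```typescript
// A pg_notify payload names a table; any cached query whose key starts
// with that table name should be invalidated and re-fetched.
type NotifyPayload = { table: string; id?: string; action?: string }

function keysToInvalidate(
  cachedKeys: readonly (readonly unknown[])[],
  payload: NotifyPayload,
): (readonly unknown[])[] {
  return cachedKeys.filter((key) => key[0] === payload.table)
}
```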
job example — background tasks via pg-boss, no Redis
```ts
// modules/projects/jobs.ts
import { defineJob } from '@vobase/core'
import { projectsService } from './service/projects'

export const sendReminder = defineJob('projects:send-reminder',
  async (data: { taskId: string }) => {
    await projectsService().notifyAssignee(data.taskId)
  },
)
```
Schedule from handlers or services: `ctx.scheduler.add('projects:send-reminder', { taskId }, { delay: '1d' })`. Retries, cron scheduling, and priority queues — all Postgres-backed via pg-boss.
agent slot example — tools, materializers, AGENTS.md fragments
```ts
// modules/projects/agent.ts
import { defineAgentTool, defineIndexContributor } from '@vobase/core'
import { projectsService } from './service/projects'

const createTask = defineAgentTool({
  name: 'create_task',
  audience: 'internal',
  lane: 'standalone',
  // schema: zod input/output…
  async handler({ input, ctx }) {
    return await projectsService().createTask(input)
  },
})

export const projectsAgent = {
  agentsMd: [defineIndexContributor({
    file: 'AGENTS.md',
    priority: 50,
    name: 'projects.overview',
    render: () => '## Projects\n\n- `create_task` to add a task to a project.',
  })],
  materializers: [/* WorkspaceMaterializerFactory<WakeContext>[] */],
  roHints: [/* explain why /projects/<id>/* paths are read-only */],
  tools: [createTask],
}
```
The wake builder filters tools by lane and audience, runs each materializer factory against the wake context, chains roHints, and feeds the AGENTS.md contributors into the harness. One agent slot per module — no central registry to update.
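The lane/audience filtering might look roughly like this — field names follow the `agent.ts` example above, but the function itself is invented for illustration and is not the actual wake-builder code:

```typescript
type Lane = 'conversation' | 'standalone'
type AgentTool = { name: string; audience: 'internal' | 'customer'; lane: Lane }

// Keep tools declared for the active lane; the standalone lane additionally
// drops customer-facing tools, matching the lane rules described earlier.
function toolsForWake(tools: AgentTool[], lane: Lane): AgentTool[] {
  return tools.filter(
    (t) => t.lane === lane && (lane !== 'standalone' || t.audience === 'internal'),
  )
}
```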
the ctx object
Every HTTP handler gets a context object with runtime capabilities. Current surface:
| Property | What it does |
|---|---|
| `ctx.db` | Drizzle instance. Full PostgreSQL — reads, writes, transactions. |
| `ctx.user` | `{ id, email, name, role, activeOrganizationId? }`. From better-auth session. RBAC middlewares: `requireRole()`, `requirePermission()`, `requireOrg()`. |
| `ctx.scheduler` | Job queue. `add(jobName, data, options)` to schedule background work. |
| `ctx.storage` | StorageService — virtual buckets with local/S3/R2 backends. |
| `ctx.channels` | ChannelsService — email and WhatsApp sends. All messages logged. |
| `ctx.integrations` | Encrypted credential vault. `ctx.integrations.getActive(provider)` returns decrypted config or null. |
| `ctx.http` | Typed HTTP client with retries, timeouts, and circuit breakers. |
| `ctx.realtime` | RealtimeService — `notify({ table, id?, action? }, tx?)` after mutations. SSE subscribers receive the event; the frontend hook invalidates matching TanStack queries. |
Modules can declare an `init(ctx: ModuleInitCtx)` hook that runs at boot with `{ db, realtime, jobs, scheduler, auth, cli }`. Cross-module callers import from `@modules/<name>/service/*` directly — no port shim, no plugin system. Unconfigured services use throw-proxies that produce descriptive errors if accessed.
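The throw-proxy pattern is easy to sketch — this is a generic illustration of the idea, not the `@vobase/core` implementation:

```typescript
// Until install runs at boot, any property access on the placeholder throws
// a descriptive error instead of returning a silent undefined.
interface ProjectsService {
  list(): string[]
}

function createThrowProxy<T extends object>(name: string): T {
  return new Proxy({} as T, {
    get(_target, prop) {
      throw new Error(`${name}.${String(prop)} used before the service was installed`)
    },
  })
}

let ref: ProjectsService = createThrowProxy<ProjectsService>('projectsService')

export function installProjectsService(svc: ProjectsService) {
  ref = svc
}

export function projectsService(): ProjectsService {
  return ref
}
```

Callers import `projectsService()` freely; if a module forgets to call `installProjectsService` in its `init`, the first access fails with a message naming the missing service.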
App-level config:
```ts
// vobase.config.ts
export default defineConfig({
  database: process.env.DATABASE_URL,
  integrations: { enabled: true }, // opt-in: encrypted credential store
  storage: { // opt-in: file storage
    provider: { type: 'local', basePath: './data/files' },
    buckets: { avatars: { maxSize: 5_000_000 }, documents: {} },
  },
  channels: { // opt-in: email + WhatsApp
    email: { provider: 'resend', from: 'noreply@example.com', resend: { apiKey: '...' } },
  },
  http: {
    timeout: 10_000,
    retries: 3,
    circuitBreaker: { threshold: 5, resetTimeout: 30_000 },
  },
  webhooks: {
    'stripe-events': {
      path: '/webhooks/stripe',
      secret: process.env.STRIPE_WEBHOOK_SECRET,
      handler: 'system:processWebhook',
      signatureHeader: 'stripe-signature',
      dedup: true,
    },
  },
})
```
Credentials stay in .env. Config declares the shape.
agent harness
The harness is the AI runtime in core. It runs on top of `@mariozechner/pi-agent-core` + `@mariozechner/pi-ai` and ships as `createHarness({...})` from `@vobase/core`. Each "wake" is one bounded run of an agent over a frozen system prompt.
Lanes — the template ships two:
- Conversation — bound to `(contactId, channelInstanceId, conversationId)`. Triggered by `inbound_message`, `supervisor`, `approval_resumed`, `scheduled_followup`, `manual`.
- Standalone — operator threads + heartbeats. Triggered by `operator_thread`, `heartbeat`. Customer-facing tools are filtered out.
Invariants baked into core:
- Frozen snapshot. System prompt computed once at `agent_start`; the `systemHash` is identical every turn so the provider's prefix cache stays warm. Mid-wake writes surface in the next turn's side-load.
- Steer/abort between turns. Customer messages append to the steer queue and drain after `tool_execution_end`. Supervisor and approval-resumed events hard-abort and re-wake.
- Tool stdout budget. 4KB inline → 100KB spill (`/tmp/tool-<callId>.txt`) → 200KB turn ceiling.
- Idle resumption + restart recovery. The harness recovers orphaned dispatches on boot and resumes idle wakes via journaled events.
- Cost cap. Daily-spend tracking + per-org evaluation gate.
Workspace — every wake runs against a virtual filesystem materialized from your modules. AGENTS.md is composed from each module's `agentsMd` contributor (plus per-tool guidance). Read-only paths are enforced at the FS boundary via `ScopedFs`. Memory writes (`/contacts/<id>/MEMORY.md`, `/agents/<id>/MEMORY.md`, `/staff/<id>/MEMORY.md`) flush at turn end.
LLM provider — one seam. Bifrost when `BIFROST_API_KEY` + `BIFROST_URL` are set, otherwise direct OpenAI / Anthropic / Google. Use `createModel(alias)` from the template's `~/wake`; never hardcode a provider-prefixed id.
Testing — pass `streamFn: stubStreamFn([...])` (inline `AssistantMessageEvent[]` per LLM call) to `bootWake` to keep tests off real providers. Live smoke tests under `tests/smoke/` exercise real keys.
vs the alternatives
| | Vobase | Supabase | Pocketbase | Rails / Laravel |
|---|---|---|---|---|
| What you get | Full-stack scaffold (backend + frontend + agent harness + skills) | Backend-as-a-service (db + auth + storage + functions) | Backend binary (db + auth + storage + API) | Full-stack framework |
| Language | TypeScript end-to-end | TypeScript (client) + PostgreSQL | Go (closed binary) | Ruby / PHP |
| Database | PostgreSQL (Docker Compose local, managed prod) | PostgreSQL (managed) | SQLite (embedded) | PostgreSQL / MySQL |
| Self-hosted | One process, one container | 10+ Docker containers | One binary | Multi-process |
| You own the code | Yes — all source in your project | No — managed service | No — compiled binary | Yes — but no AI conventions |
| AI agent runtime | First-class harness (frozen prompts, tool budget, steer/abort) | Edge functions only | None | None |
| AI integration | Skills + MCP + canonical module shape | None | None | None |
| How you customize | Edit the code. AI reads it. | Dashboard + RLS policies | Admin UI + hooks | Edit the code |
| Hosting cost | As low as $15/mo | $25/mo+ (or complex self-host) | Free (self-host) | Varies |
| Data isolation | Physical (one db per app) | Logical (RLS) | Physical | Varies |
| License | MIT | Apache 2.0 | MIT | MIT |
vs Supabase: Self-hosted Supabase is 10+ Docker containers. RLS policies are hard to reason about. You don't own the backend code. Vobase is one process, you own every line — AI agents can read and modify everything.
vs Pocketbase: Pocketbase is a Go binary. You can see the admin UI, but you can't read or modify the internals. When you need custom business logic, you're writing Go plugins or calling external services. Vobase is TypeScript you own — AI agents understand and extend it natively.
vs Rails / Laravel: Great frameworks, but they weren't designed for AI coding agents. Vobase's canonical module shape and skill packs mean AI-generated code follows your patterns consistently. Plus: simpler stack (no Redis, single process, TypeScript end-to-end).
runtime architecture
One Bun process. One Docker container. One app.
```
Docker container (--restart=always)
└── Bun process (PID 1)
    ├── Hono server
    │   ├── /api/auth/*    → better-auth (sessions, OTP, CSRF)
    │   ├── /api/<mod>/*   → module web routes (session-validated)
    │   ├── /api/cli/*     → CLI catalog + dispatch (HTTP-RPC)
    │   ├── /mcp           → MCP server (same process, shared port)
    │   ├── /webhooks/*    → inbound channel webhooks (signature verified, dedup)
    │   ├── /api/realtime  → SSE stream (LISTEN/NOTIFY → client)
    │   └── /*             → frontend (static, from dist/)
    ├── Drizzle (postgres-js → PostgreSQL)
    ├── Built-in modules (in @vobase/core)
    │   ├── _auth          → better-auth behind AuthAdapter contract
    │   ├── _audit         → audit log, record tracking, auth hooks
    │   ├── _sequences     → gap-free business number counters
    │   ├── _integrations  → encrypted credential vault, platform OAuth handoff (opt-in)
    │   ├── _storage       → virtual buckets, local/S3/R2 (opt-in)
    │   └── _channels      → unified messaging, adapter pattern (opt-in)
    ├── Template modules (in @vobase/template)
    │   ├── settings · contacts · team · drive · messaging
    │   ├── agents · schedules · channels · changes · system
    │   └── wake/ → agent harness seam (conversation + standalone lanes)
    ├── pg-boss (Postgres-backed job queue, pg-boss own schema)
    ├── Outbound HTTP (typed fetch, retries, circuit breakers)
    └── Audit middleware (all mutations → audit_log)
```
mcp server
Runs in the same Bun process on the same port. Authenticated via API keys (better-auth apiKey plugin). When you connect Claude Code, Codex, Cursor, or any MCP-compatible tool, it sees your app:
| Tool | What it does |
|---|---|
| `list_modules` | List all registered modules (built-in + user) |
| `read_module` | Read table names from a specific module schema |
| `get_schema` | List all table names across every module |
| `view_logs` | Return recent audit log entries |
The AI sees your exact data model, your existing modules, and the conventions before it writes a single line of code.
deployment
Ship a Docker image. Railway, Fly.io, or any Docker host. Set `DATABASE_URL` to a managed Postgres connection.
Railway (quickest):
```sh
railway up
```
The template ships with `Dockerfile` and `railway.json` pre-configured. Add a Postgres plugin and Railway sets `DATABASE_URL` automatically.
Docker Compose:
```yaml
# docker-compose.yml
services:
  vobase:
    image: your-registry/my-vobase:latest
    restart: always
    environment:
      DATABASE_URL: postgres://user:pass@db:5432/vobase
    ports:
      - "3000:3000"
  db:
    image: pgvector/pgvector:pg17
    volumes:
      - pgdata:/var/lib/postgresql/data
    environment:
      POSTGRES_DB: vobase
      POSTGRES_USER: user
      POSTGRES_PASSWORD: pass
volumes:
  pgdata:
```
project commands
After scaffolding, your project uses standard tools directly — no wrapper CLI:
| Command | What it does |
|---|---|
| `docker compose up -d` | Start local Postgres (pgvector/pg17, port 5432). |
| `bun run dev` | Bun backend with `--hot` and Vite frontend, both via concurrently. |
| `bun run db:push` | Push schema to database (dev). |
| `bun run db:generate` | Generate migration files for production. |
| `bun run db:migrate` | Run migrations against the database. |
| `bun run db:seed` | Seed default admin user and sample data. |
| `bun run db:reset` | Nuke + push + seed. |
| `bun run db:studio` | Drizzle Studio for visual database browsing. |
| `bun run check` | Run every `check:*` (shape, bundle, no-auto-nav-tabs, shadcn-overrides). |
| `bun run test` | Full test suite. `test:e2e` / `test:smoke` for live integration. |
project structure
```
my-app/
  .env
  .env.example
  package.json          — depends on @vobase/core
  docker-compose.yml    — local Postgres (pgvector/pg17)
  drizzle.config.ts
  vite.config.ts
  index.html
  main.ts               — ~10-line Bun.serve entry
  CLAUDE.md             — project context and guardrails
  AGENTS.md             — agent guardrails (mirrors CLAUDE.md)
  .claude/
    skills/             — skill packs the AI reads when generating code
  auth/                 — better-auth + plugins
  runtime/
    index.ts            — cross-module primitives, ModuleDef/ModuleInitCtx
    bootstrap.ts        — createApp(), worker registration
    modules.ts          — static list of modules
  wake/                 — agent harness seam (top-level)
    conversation.ts     — conversation lane builder
    standalone.ts       — standalone lane builder
    inbound.ts          — channels:inbound-to-wake handler
    supervisor.ts       — messaging:supervisor-to-wake handler
    operator-thread.ts  — agents:operator-thread-to-wake handler
    heartbeat.ts        — schedules cron-tick callback
    llm.ts              — Bifrost / direct provider seam
    trigger.ts          — WakeTriggerKind registry
    workspace/          — per-wake virtual FS materializers
    observers/          — workspace-sync, journal, etc.
  modules/
    settings/           — notification prefs, per-user UI state
    contacts/           — customer records + /contacts/<id>/MEMORY.md
    team/               — staff directory + attributes
    drive/              — virtual filesystem; modules register overlays
    messaging/          — conversations, messages, notes, supervisor fan-out
    agents/             — definitions, learned skills, staff memory, scores
    schedules/          — agent_schedules + cron heartbeats
    channels/           — umbrella for adapters/{web,whatsapp,...}
    changes/            — generic propose/decide/apply/history
    system/             — ops dashboard, dev helpers
    <each module>/
      module.ts         — thin aggregator
      schema.ts
      state.ts
      service/          — sole writer of this module's tables
      handlers/
      web.ts
      pages/            — React pages (TanStack file-based routes)
      components/
      hooks/
      jobs.ts
      agent.ts          — tools, materializers, roHints, AGENTS.md fragments
      tools/            — defineAgentTool
      verbs/            — defineCliVerb
      cli.ts            — <module>Verbs barrel
      seed.ts
      defaults/         — *.agent.yaml, *.schedule.yaml
      skills/           — inline skill bodies
  src/                  — frontend shell only
    main.tsx
    routeTree.gen.ts    — generated TanStack route tree
    lib/
      api-client.ts     — Hono RPC client
    components/
      ui/               — shadcn/ui (owned by you)
      ai-elements/      — AI chat UI components (owned by you)
      data-table/       — DiceUI data-table components
      shell/
        app-layout.tsx  — main app shell with sidebar
        command-palette.tsx
      auth/
      settings/
    hooks/
    styles/
    stores/
  tests/
    e2e/                — real Postgres
    smoke/              — live server, real LLM key
  data/
    files/              — optional, created on first upload
```
license
MIT. Own everything.
