
Architecture

This page provides a high-level overview of Kaji's platform architecture. It's intended for technical stakeholders who want to understand how the system is structured.


System overview

Kaji is a multi-component platform deployed on the Shakudo Kubernetes infrastructure. At a high level:

  1. Users interact through KajiChat, a real-time web application
  2. KajiChat sends prompts to KajiCore, the AI agent runtime
  3. KajiCore executes tasks using tools, skills, and integrations
  4. Platform services provide routing, event processing, and persistence
Users
  │
  ▼
KajiChat (Web UI) ←── Real-time chat, SSE, LLM proxy
  │
  ▼
Kaji Event Bus ←── Routes triggers, manages KajiCore lifecycle
  │
  ▼
KajiCore (Agent Runtime) ←── LLM loop, tools, memory, skills
  ├──→ LLM Router ←── Classifies prompts, selects model tier
  ├──→ AI Gateway ←── Routes to OpenAI, Anthropic, Gemini, etc.
  ├──→ MCP Servers ←── Tool integrations (ClickUp, Notion, etc.)
  ├──→ FalkorDB ←── Memory graph (entities, facts, episodes)
  └──→ SeaweedFS ←── File storage and session persistence

Core components

KajiChat

The user-facing web application. Built with Rust (Actix-Web) and Dioxus (WASM frontend).

| Responsibility | Detail |
| --- | --- |
| Chat interface | Real-time messaging with SSE (Server-Sent Events) |
| LLM proxy | Streams responses from OpenAI, Anthropic, and Gemini |
| File attachments | Upload, share, and reference files in conversations |
| Labels and filters | Organize and find conversations |
| AutoKaji management | Create, pause, and monitor scheduled automations |
| Usage guide | Built-in guide showing prompt patterns and capabilities |
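To make the SSE streaming concrete, here is a minimal sketch of how a streamed chat response might be framed as Server-Sent Events. The event names (`token`, `done`) and payload fields are illustrative assumptions, not KajiChat's actual wire format:

```python
import json

def sse_event(event: str, data: dict) -> str:
    """Frame a payload as a Server-Sent Events message (event names are hypothetical)."""
    return f"event: {event}\ndata: {json.dumps(data)}\n\n"

# A streamed chat response is a sequence of token events followed by a done event.
frames = [
    sse_event("token", {"chat_id": "abc123", "delta": "Hello"}),
    sse_event("token", {"chat_id": "abc123", "delta": ", world"}),
    sse_event("done", {"chat_id": "abc123"}),
]
stream = "".join(frames)
```

The browser's `EventSource` API parses this framing natively, which is why SSE is a common fit for one-directional token streams.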

KajiCore

The AI agent runtime — the brain of the platform. Each active conversation maps to one KajiCore process (1:1).

| Responsibility | Detail |
| --- | --- |
| Agent loop | Multi-turn LLM conversation with tool dispatch |
| Tool execution | Run bash commands, read/write/edit files, search, ask questions |
| MCP integration | JSON-RPC transport to external tool servers |
| Skill injection | Discover and load specialized instruction sets |
| Memory | FalkorDB-backed knowledge graph with entities and relationships |
| Compaction | Summarizes long conversations to preserve context |
| Persistence | SQLite with Litestream WAL replication to SeaweedFS |
| Circuit breakers | Detects loops and consecutive errors, halting the agent gracefully |
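The circuit-breaker idea can be sketched in a few lines: halt the agent loop after too many consecutive tool errors or too many identical repeated actions. Thresholds and method names here are illustrative assumptions, not KajiCore's implementation:

```python
class CircuitBreaker:
    """Minimal sketch: trip after N consecutive errors or N identical repeated actions."""

    def __init__(self, max_errors: int = 3, max_repeats: int = 3):
        self.max_errors = max_errors
        self.max_repeats = max_repeats
        self.consecutive_errors = 0
        self.last_action = None
        self.repeat_count = 0

    def record(self, action: str, ok: bool) -> bool:
        """Record one tool call; return True if the agent loop should halt."""
        self.consecutive_errors = 0 if ok else self.consecutive_errors + 1
        if action == self.last_action:
            self.repeat_count += 1
        else:
            self.last_action, self.repeat_count = action, 1
        return (self.consecutive_errors >= self.max_errors
                or self.repeat_count >= self.max_repeats)
```

The point is that the check is cheap and local: the loop calls `record(...)` once per tool dispatch and bails out cleanly instead of burning tokens on a stuck agent.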

Kaji Event Bus

Bridges user triggers into KajiCore lifecycle management via NATS JetStream.

| Responsibility | Detail |
| --- | --- |
| Trigger routing | Routes @kaji mentions and prompt triggers to the right KajiCore process |
| Process lifecycle | Spawns and manages KajiCore pods on demand |
| Event normalization | Maps raw events into structured chat events |
| NATS integration | Publishes to kaji.events.* and kaji.chat.* streams |
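Event normalization can be pictured as a pure transform: a raw trigger comes in, and a structured chat event plus its NATS subject come out. The field names and subject shape below are illustrative assumptions, not the actual schema:

```python
def normalize(raw: dict) -> tuple[str, dict]:
    """Sketch: map a raw @kaji mention into a structured chat event
    and the NATS subject it would be published on (hypothetical schema)."""
    chat_id = raw["chat_id"]
    event = {
        "type": "prompt",
        "chat_id": chat_id,
        "user": raw.get("user", "unknown"),
        "text": raw["text"].removeprefix("@kaji").strip(),
    }
    subject = f"kaji.chat.{chat_id}"
    return subject, event
```

Keeping normalization side-effect-free makes it easy to test independently of the NATS client and the pod-lifecycle logic.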

Kaji LLM Router

A stateless proxy that classifies prompt complexity and routes to the appropriate model tier.

| Responsibility | Detail |
| --- | --- |
| Prompt classification | 3-tier signal detection (low / medium / high complexity) |
| Model selection | Routes to the best-fit model via virtual model names |
| Fallback | Automatic retry through the AI Gateway's priority-based backend refs |
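A 3-tier classifier of this kind can be as simple as keyword and length signals mapped to a tier, with each tier resolving to a virtual model name. The signals, thresholds, and virtual model names below are invented for illustration; the router's real heuristics are not documented here:

```python
def classify(prompt: str) -> str:
    """Illustrative 3-tier complexity classifier; signals are hypothetical."""
    high_signals = ("refactor", "architecture", "prove", "multi-step", "debug")
    medium_signals = ("summarize", "explain", "compare", "translate")
    text = prompt.lower()
    if len(prompt) > 2000 or any(s in text for s in high_signals):
        return "high"
    if len(prompt) > 300 or any(s in text for s in medium_signals):
        return "medium"
    return "low"

# Hypothetical virtual model names resolved downstream by the AI Gateway.
VIRTUAL_MODELS = {"low": "kaji-fast", "medium": "kaji-balanced", "high": "kaji-deep"}
```

Because the proxy is stateless, classification like this can scale horizontally without coordination.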

Supporting infrastructure

| Component | Purpose |
| --- | --- |
| NATS JetStream | Event bus for real-time message routing between components |
| PostgreSQL | KajiChat database — chats, messages, participants, attachments, sessions |
| FalkorDB | Graph database for KajiCore's memory system |
| SeaweedFS | Object storage for file attachments, session persistence, and learned skills |
| AI Gateway (Envoy) | LLM backend routing with priority-based fallback across providers |
| Keycloak | SSO authentication for KajiChat users |
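The AI Gateway's priority-based fallback amounts to trying backends in priority order until one succeeds. A minimal sketch, where the backend list, `priority` field, and `send` callable are all assumptions for illustration:

```python
def call_with_fallback(prompt: str, backends: list[dict], send) -> str:
    """Sketch of priority-based fallback: try each backend in ascending
    priority order; raise only if every backend fails. Names are hypothetical."""
    errors = []
    for backend in sorted(backends, key=lambda b: b["priority"]):
        try:
            return send(backend["name"], prompt)
        except RuntimeError as exc:
            errors.append((backend["name"], str(exc)))
    raise RuntimeError(f"all backends failed: {errors}")
```

Collecting the per-backend errors before raising keeps the final failure debuggable, which matters when several providers are in the chain.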

Data flow example

Here's how a typical user prompt flows through the system:

  1. User types a message in KajiChat web UI
  2. KajiChat persists the message to PostgreSQL and publishes a trigger event to NATS
  3. Kaji Event Bus receives the trigger and ensures a KajiCore process is running for that chat
  4. KajiCore receives the prompt and sends it to the LLM Router → AI Gateway → model provider
  5. The model responds; KajiCore may execute tools (bash, file ops, MCP calls)
  6. KajiCore streams results back through NATS → KajiChat → user's browser via SSE
  7. Memory facts and session state are persisted to FalkorDB and SeaweedFS
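The steps above can be condensed into a toy pipeline. Everything here — the in-memory `db` and `bus` lists, subject names, and the length-based tier shortcut — is an illustrative stand-in for the real components:

```python
def handle_prompt(text: str, chat_id: str, db: list, bus: list) -> str:
    """Condensed sketch of steps 1-7 with in-memory stand-ins for Postgres and NATS."""
    db.append({"chat_id": chat_id, "text": text})        # step 2: persist message
    bus.append(("kaji.events.trigger", chat_id))         # step 2: publish trigger
    tier = "high" if len(text) > 300 else "low"          # step 4: route by complexity
    reply = f"[{tier}-tier model] reply to: {text}"      # step 5: model responds
    bus.append((f"kaji.chat.{chat_id}", reply))          # step 6: stream result back
    return reply
```

The real system does the same dance asynchronously across processes; the ordering of side effects is what this sketch is meant to convey.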

Deployment

Kaji runs on Shakudo's Kubernetes platform as microservices:

| Service | Namespace | Port |
| --- | --- | --- |
| KajiChat | hyperplane-kaji | 3000 |
| Kaji Event Bus | hyperplane-kaji | — (NATS consumer) |
| KajiCore pods | hyperplane-kaji | 9091 (health/metrics) |
| Kaji LLM Router | hyperplane-pipelines | — (proxy) |

Next steps