# AI Module Overview
The AI module implements the backend building blocks for configurable AI agents: agent CRUD, chat history, streamed assistant responses, knowledge-base ingestion with RAG, model catalog synchronization, and per-org token usage tracking.
## Current Runtime Status
- The module code exists under `apps/api/src/modules/ai`, but `AppModule` does not currently import `AIModule`. `AIModule` only registers the `ai` TypeORM connection today; its controller and provider arrays are still empty, so the controller-defined route surface is not live yet.
- The docs in this section describe the implemented module behavior in source, plus the current readiness gaps that still prevent end-to-end activation.
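To make the gap concrete, the missing wiring would look roughly like the sketch below. This is a hypothetical illustration following standard NestJS conventions, not the module's actual source: the controller names come from this doc, while the import paths, datasource options, and provider list are assumptions.

```typescript
// Hypothetical wiring sketch: none of this is active in AppModule today.
// Controller names come from this doc; file paths are assumptions.
import { Module } from '@nestjs/common';
import { TypeOrmModule } from '@nestjs/typeorm';

import { AgentController } from './presentation/agent.controller'; // assumed path
import { ChatController } from './presentation/chat.controller';   // assumed path

@Module({
  imports: [
    // Today AIModule only registers the `ai` connection, roughly:
    TypeOrmModule.forRoot({ name: 'ai' /* datasource options elided */ }),
  ],
  // Currently empty in source; activation would mean filling these in:
  controllers: [AgentController, ChatController /* Message, Knowledge, Model */],
  providers: [/* command handlers, repositories, gateway clients */],
})
export class AIModule {}

// AppModule would then need to import the module for routes to go live:
@Module({ imports: [AIModule] })
export class AppModule {}
```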
## Responsibilities
- Manage AI agent definitions, including brain, instruction, knowledge, and limit configurations.
- Persist chats and messages, then stream assistant responses through the Vercel AI SDK gateway.
- Run internal subagents for guardrail checks and first-message title generation.
- Index knowledge documents into Qdrant and expose a RAG tool that agents can call during execution.
- Mirror the Vercel AI Gateway model catalog into local persistence and track sync status.
- Record detailed token and cost usage logs, then aggregate monthly organization summaries.
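The last responsibility, rolling detailed usage logs into monthly per-organization summaries, can be sketched as plain TypeScript. Only the entity names `TokenUsageLog` and `OrgTokenUsageSummary` come from the module's domain layer; the field names and the `summarizeMonthly` helper are assumptions for illustration.

```typescript
// Illustrative sketch: aggregate per-request token usage logs into
// monthly per-organization summaries. Field names are assumptions;
// only the entity names come from the module's domain layer.
interface TokenUsageLog {
  orgId: string;
  inputTokens: number;
  outputTokens: number;
  costUsd: number;
  createdAt: Date;
}

interface OrgTokenUsageSummary {
  orgId: string;
  month: string; // "YYYY-MM"
  inputTokens: number;
  outputTokens: number;
  costUsd: number;
}

function summarizeMonthly(logs: TokenUsageLog[]): OrgTokenUsageSummary[] {
  const byKey = new Map<string, OrgTokenUsageSummary>();
  for (const log of logs) {
    const month = log.createdAt.toISOString().slice(0, 7);
    const key = `${log.orgId}:${month}`;
    const summary = byKey.get(key) ?? {
      orgId: log.orgId,
      month,
      inputTokens: 0,
      outputTokens: 0,
      costUsd: 0,
    };
    summary.inputTokens += log.inputTokens;
    summary.outputTokens += log.outputTokens;
    summary.costUsd += log.costUsd;
    byKey.set(key, summary);
  }
  return [...byKey.values()];
}
```

In the real module this aggregation would run against persisted rows rather than an in-memory array, but the grouping key (organization plus calendar month) is the essential idea.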
## Architecture
The module follows the same layered CQRS structure used by the rest of `apps/api`:
- Presentation: `AgentController`, `ChatController`, `MessageController`, `KnowledgeController`, and `ModelController` define the HTTP surface.
- Application: command handlers orchestrate agent CRUD, chat lifecycle, streamed execution, knowledge ingestion, and model refresh jobs.
- Infrastructure: TypeORM persistence, Qdrant, Vercel AI Gateway, embeddings, Redis-backed chat caches, and internal subagents implement the runtime integrations.
- Domain: `Agent`, `Chat`, `Message`, `AgentKnowledgeDocument`, `Model`, `TokenUsageLog`, and `OrgTokenUsageSummary` capture the core business state.
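The layering can be illustrated with a stripped-down, framework-free sketch of the command path for agent creation. Everything here beyond the `Agent` entity name is an assumption: the real module uses NestJS CQRS handlers and TypeORM repositories, whereas this sketch uses a plain class and an in-memory store to show how the layers connect.

```typescript
// Framework-free sketch of the CQRS command path for creating an agent.
// The real module uses NestJS handlers and TypeORM persistence; the
// in-memory repository and handler shape here are illustrative assumptions.
interface Agent {
  id: string;
  name: string;
  instructions: string;
}

// Application layer: a command carrying the caller's intent.
class CreateAgentCommand {
  constructor(
    public readonly name: string,
    public readonly instructions: string,
  ) {}
}

// Infrastructure boundary: the handler depends on an interface, not a DB.
interface AgentRepository {
  save(agent: Agent): Promise<Agent>;
}

class InMemoryAgentRepository implements AgentRepository {
  private readonly agents = new Map<string, Agent>();
  async save(agent: Agent): Promise<Agent> {
    this.agents.set(agent.id, agent);
    return agent;
  }
}

// Application layer: the handler turns the command into domain state.
class CreateAgentHandler {
  constructor(private readonly repo: AgentRepository) {}
  async execute(cmd: CreateAgentCommand): Promise<Agent> {
    const agent: Agent = {
      id: `agent-${Date.now()}`, // real code would use a UUID
      name: cmd.name,
      instructions: cmd.instructions,
    };
    return this.repo.save(agent);
  }
}
```

In the full module, a presentation-layer controller would translate an HTTP request into the command and dispatch it through the command bus; the handler and repository stay unaware of HTTP entirely.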