Chat App - Features & Capabilities Report
Template Version: 1.0.0 | Generated: 2026-01-22
Package: @epicdm/flowstate-app-chat | App ID: chat
Table of Contents
- Overview
- App Identity
- Feature Categories
- Capabilities
- Data Models
- User Workflows
- Integration Points
- Architecture Summary
- Quality & Compliance
- FlowState Alignment
- Appendix
Overview
Purpose
The Chat App is a multi-provider LLM chat interface for FlowState. It enables users to have real-time streaming conversations with various AI models including Anthropic Claude, OpenAI GPT, and local models via LM Studio. Conversations are persistently stored in RxDB with organization and workspace scoping, enabling team collaboration and personal conversation management.
Target Users
| User Type | Description | Primary Use Cases |
|---|---|---|
| Developers | Engineers needing AI assistance | Code assistance, debugging, API exploration, tool execution |
| Knowledge Workers | Business professionals | Document drafting, research, Q&A with persistent history |
| Teams | Collaborative groups | Shared workspace conversations, organization context |
| Power Users | Advanced users | Local model integration, custom system prompts |
Value Proposition
- Multi-Provider Support: Switch between Anthropic, OpenAI, and LM Studio seamlessly
- Real-Time Streaming: See responses token-by-token as they generate
- Persistent History: Never lose a conversation with RxDB-backed storage
- MCP Tool Integration: Execute tools directly from conversations
- Markdown Excellence: Full GFM support with syntax-highlighted code blocks
- Flexible Configuration: Per-conversation model and settings selection
App Identity
| Property | Value |
|---|---|
| ID | chat |
| Display Name | Chat |
| Package Name | @epicdm/flowstate-app-chat |
| Version | 0.1.0 |
| Category | business |
| Icon | message-square |
| Color | #06B6D4 |
| Base Path | /chat |
| Permissions | database |
Entry Points
| Entry Point | Path | Description |
|---|---|---|
| Main Export | src/index.ts | Primary package export |
| Standalone | src/standalone.tsx | Standalone app entry |
| App Component | src/App.tsx | Main React component with routing |
| Plugin | src/plugin.ts | FlowState plugin registration |
Feature Categories
Category 1: Conversation Management
Core conversation lifecycle and organization
Feature: Conversation CRUD Operations
| Property | Details |
|---|---|
| ID | conversation-crud |
| Status | Implemented |
| Priority | High |
Description: Create, view, update, and delete conversations with provider/model selection, custom settings, and metadata tracking.
User Story: As a user, I want to manage my conversations so that I can organize my chat history effectively.
Acceptance Criteria:
- [x] Create new conversations with default settings
- [x] View conversations in sidebar list sorted by most recent
- [x] Update conversation title inline
- [x] Delete conversations with cascade message deletion
- [x] Switch between conversations instantly
- [x] Track message count and last activity
Implementation:
| Component | Path | Purpose |
|---|---|---|
| Sidebar | src/components/Sidebar.tsx | Conversation list and new chat |
| ConversationItem | src/components/ConversationItem.tsx | Single conversation display |
| ChatContext | src/contexts/ChatContext.tsx | Conversation state management |
| useConversations | src/hooks/useConversations.ts | Conversation queries |
Routes:
- /chat - Redirects to new conversation
- /chat/:conversationId - Active conversation view
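The routing behavior can be sketched with react-router (an assumption based on the App.tsx "routing" note); the `/chat/new` redirect target and the component wiring are illustrative rather than the actual implementation.

```tsx
// Illustrative route structure for the paths listed above.
// Assumes react-router-dom v6; the redirect target is a placeholder.
import { Routes, Route, Navigate } from 'react-router-dom';
import { ChatPage } from './pages/ChatPage';

function ChatRoutes() {
  return (
    <Routes>
      {/* /chat redirects to a fresh conversation */}
      <Route path="/chat" element={<Navigate to="/chat/new" replace />} />
      {/* /chat/:conversationId renders the active conversation */}
      <Route path="/chat/:conversationId" element={<ChatPage />} />
    </Routes>
  );
}
// Inside ChatPage, the active conversation id is read from the URL (useParams).
```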
Feature: Conversation Settings
| Property | Details |
|---|---|
| ID | conversation-settings |
| Status | Implemented |
| Priority | Medium |
Description: Per-conversation configuration including provider, model, temperature, max tokens, and system prompt.
User Story: As a user, I want to configure each conversation's settings so that I can customize AI behavior.
Acceptance Criteria:
- [x] Select provider (Anthropic, OpenAI, LM Studio)
- [x] Choose model within provider
- [x] Configure temperature setting
- [x] Set max tokens limit
- [x] Define custom system prompt
- [x] Persist settings per conversation
Implementation:
| Component | Path | Purpose |
|---|---|---|
| ModelSelector | src/components/ModelSelector.tsx | Provider/model dropdown |
| Header | src/components/Header.tsx | Settings display and edit |
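The settings captured by this feature imply a per-conversation shape along the following lines; field names are assumptions, since the authoritative schema lives in @epicdm/flowstate-collections.

```ts
// Per-conversation settings as implied by the acceptance criteria above.
// Field names are illustrative, not the actual collection schema.
type ChatProvider = 'anthropic' | 'openai' | 'lmstudio';

interface ConversationSettings {
  provider: ChatProvider;  // Anthropic, OpenAI, or LM Studio
  model: string;           // model id within the selected provider
  temperature: number;     // sampling temperature
  maxTokens: number;       // response token limit
  systemPrompt?: string;   // optional custom system prompt
}
```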
Category 2: Messaging
Core chat messaging with streaming
Feature: Message Sending
| Property | Details |
|---|---|
| ID | message-send |
| Status | Implemented |
| Priority | High |
Description: Send messages to AI models with streaming response display and automatic scroll.
User Story: As a user, I want to send messages and see AI responses in real-time so that I can have natural conversations.
Acceptance Criteria:
- [x] Multi-line text input with auto-resize
- [x] Enter to send, Shift+Enter for new line
- [x] Disabled during streaming
- [x] Message persisted to database
- [x] Conversation metadata updated
Implementation:
| Component | Path | Purpose |
|---|---|---|
| ChatInput | src/components/ChatInput.tsx | Message input with send button |
| ChatPage | src/pages/ChatPage.tsx | Message orchestration |
| ChatService | src/services/ChatService.ts | Message sending logic |
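A minimal sketch of the input behavior listed above (Enter sends, Shift+Enter adds a newline, input disabled while streaming); prop names such as `onSend` and `isStreaming` are assumptions.

```tsx
// Illustrative keyboard handling for the message input.
// The real component is src/components/ChatInput.tsx and also handles auto-resize.
import { useState, KeyboardEvent } from 'react';

function ChatInput({ onSend, isStreaming }: { onSend: (text: string) => void; isStreaming: boolean }) {
  const [value, setValue] = useState('');

  const handleKeyDown = (e: KeyboardEvent<HTMLTextAreaElement>) => {
    // Enter sends; Shift+Enter inserts a newline; sending is blocked while streaming.
    if (e.key === 'Enter' && !e.shiftKey && !isStreaming) {
      e.preventDefault();
      if (value.trim()) {
        onSend(value.trim());
        setValue('');
      }
    }
  };

  return (
    <textarea
      value={value}
      onChange={(e) => setValue(e.target.value)}
      onKeyDown={handleKeyDown}
      disabled={isStreaming}
      rows={1}
    />
  );
}
```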
Feature: Streaming Responses
| Property | Details |
|---|---|
| ID | streaming-responses |
| Status | Implemented |
| Priority | High |
Description: Real-time token-by-token response streaming with visual indicators and auto-scroll.
User Story: As a user, I want to see AI responses appear in real-time so that I don't have to wait for complete responses.
Acceptance Criteria:
- [x] Token-by-token content update
- [x] Streaming indicator visible during generation
- [x] Auto-scroll to latest content
- [x] Support for both API proxy and direct client modes
- [x] Graceful error handling
Implementation:
| Component | Path | Purpose |
|---|---|---|
| MessageList | src/components/MessageList.tsx | Message display with streaming |
| MessageBubble | src/components/MessageBubble.tsx | Individual message render |
| ChatContext | src/contexts/ChatContext.tsx | Streaming state management |
| ChatService | src/services/ChatService.ts | Stream processing |
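Conceptually, the streaming loop appends each delta to the in-progress message and pushes the partial content into state so the message list re-renders. A hedged sketch, assuming the provider client exposes the deltas as an async iterable:

```ts
// Conceptual stream consumption: accumulate deltas and surface partial content
// to the UI on every token. Names are illustrative.
async function consumeStream(
  stream: AsyncIterable<string>,
  onDelta: (partialContent: string) => void
): Promise<string> {
  let content = '';
  for await (const delta of stream) {
    content += delta;
    onDelta(content); // e.g. ChatContext state update -> MessageBubble re-renders, list auto-scrolls
  }
  return content; // final content is persisted once the stream completes
}
```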
Feature: Message Display
| Property | Details |
|---|---|
| ID | message-display |
| Status | Implemented |
| Priority | High |
Description: Rich message rendering with markdown support, syntax-highlighted code blocks, and metadata display.
User Story: As a user, I want messages to be beautifully formatted so that I can easily read code and structured content.
Acceptance Criteria:
- [x] Markdown rendering with GFM support
- [x] Syntax-highlighted code blocks
- [x] User/assistant message differentiation
- [x] Timestamp formatting (relative/absolute)
- [x] Token count and cost display (when available)
- [x] Latency metrics display
Implementation:
| Component | Path | Purpose |
|---|---|---|
| MessageBubble | src/components/MessageBubble.tsx | Message card with markdown |
| MessageList | src/components/MessageList.tsx | Scrollable message list |
Dependencies:
- react-markdown - Markdown parsing
- remark-gfm - GitHub Flavored Markdown
- react-syntax-highlighter - Code syntax highlighting
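With these dependencies, message rendering typically follows the pattern below; the exact wiring in MessageBubble.tsx may differ, but this is the standard react-markdown + remark-gfm + react-syntax-highlighter combination.

```tsx
// Standard markdown + code-block rendering with the dependencies listed above.
import ReactMarkdown from 'react-markdown';
import remarkGfm from 'remark-gfm';
import { Prism as SyntaxHighlighter } from 'react-syntax-highlighter';

function MessageContent({ content }: { content: string }) {
  return (
    <ReactMarkdown
      remarkPlugins={[remarkGfm]}
      components={{
        code({ className, children }) {
          // Fenced blocks carry a language-xxx class; highlight those, leave inline code plain.
          const match = /language-(\w+)/.exec(className ?? '');
          return match ? (
            <SyntaxHighlighter language={match[1]} PreTag="div">
              {String(children).replace(/\n$/, '')}
            </SyntaxHighlighter>
          ) : (
            <code className={className}>{children}</code>
          );
        },
      }}
    >
      {content}
    </ReactMarkdown>
  );
}
```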
Category 3: LLM Provider Integration
Multi-provider AI model support
Feature: Anthropic Claude Support
| Property | Details |
|---|---|
| ID | provider-anthropic |
| Status | Implemented |
| Priority | High |
Description: Integration with Anthropic's Claude models including Sonnet, Opus, and Haiku variants.
User Story: As a user, I want to use Anthropic Claude models so that I can leverage their capabilities.
Acceptance Criteria:
- [x] Claude Sonnet 4.5 support
- [x] Claude Opus 4.1 support
- [x] Claude Haiku 4.5 support
- [x] Previous generation models (4.0, 4)
- [x] Streaming responses
- [x] Token usage tracking
Implementation:
| Component | Path | Purpose |
|---|---|---|
| ChatService | src/services/ChatService.ts | Anthropic client initialization |
| ModelSelector | src/components/ModelSelector.tsx | Model selection UI |
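For illustration, a streaming Claude call against the official @anthropic-ai/sdk looks like the following; the app actually routes the call through the shared LLM client abstraction, and the model id shown is an example rather than the exact id exposed by ModelSelector.

```ts
// Illustrative Anthropic streaming call with token usage tracking.
import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });

const stream = anthropic.messages.stream({
  model: 'claude-sonnet-4-5', // example model id
  max_tokens: 1024,
  messages: [{ role: 'user', content: 'Hello, Claude' }],
});

stream.on('text', (text) => {
  // Each text delta can be appended to the in-progress assistant message.
  process.stdout.write(text);
});

const finalMessage = await stream.finalMessage();
console.log(finalMessage.usage); // input/output token counts for usage tracking
```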
Feature: OpenAI GPT Support
| Property | Details |
|---|---|
| ID | provider-openai |
| Status | Implemented |
| Priority | High |
Description: Integration with OpenAI's GPT models including GPT-4o, GPT-4 Turbo, and GPT-3.5 Turbo.
User Story: As a user, I want to use OpenAI GPT models so that I can access their capabilities.
Acceptance Criteria:
- [x] GPT-4o support
- [x] GPT-4 Turbo support
- [x] GPT-3.5 Turbo support
- [x] Streaming responses
- [x] Token usage tracking
Implementation:
| Component | Path | Purpose |
|---|---|---|
| ChatService | src/services/ChatService.ts | OpenAI client initialization |
| ModelSelector | src/components/ModelSelector.tsx | Model selection UI |
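The equivalent sketch for OpenAI uses the official openai package's streaming chat completions; again, the app's real call path goes through the provider abstraction.

```ts
// Illustrative OpenAI streaming call; deltas arrive chunk by chunk.
import OpenAI from 'openai';

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

const stream = await openai.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Hello, GPT' }],
  stream: true,
});

for await (const chunk of stream) {
  // delta.content carries the incremental tokens
  const delta = chunk.choices[0]?.delta?.content ?? '';
  process.stdout.write(delta);
}
```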
Feature: LM Studio Support
| Property | Details |
|---|---|
| ID | provider-lmstudio |
| Status | Implemented |
| Priority | Medium |
Description: Integration with local models via LM Studio's OpenAI-compatible API.
User Story: As a power user, I want to use local models so that I can run AI privately.
Acceptance Criteria:
- [x] Connect to local LM Studio server
- [x] Compatible with any loaded model
- [x] Streaming responses
- [x] Configurable base URL
Implementation:
| Component | Path | Purpose |
|---|---|---|
| ChatService | src/services/ChatService.ts | LM Studio client initialization |
| ModelSelector | src/components/ModelSelector.tsx | Model selection UI |
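Because LM Studio exposes an OpenAI-compatible API, the same openai client can be pointed at the local server; the base URL below is LM Studio's common default and stands in for the configurable value the app reads from its config.

```ts
// LM Studio via the OpenAI-compatible endpoint. The URL is an assumption
// (LM Studio's usual default); the app takes the base URL from configuration.
import OpenAI from 'openai';

const lmstudio = new OpenAI({
  baseURL: 'http://localhost:1234/v1', // configurable base URL
  apiKey: 'lm-studio',                 // placeholder; local servers typically ignore the key
});

// Any model currently loaded in LM Studio can then be used with the same
// chat.completions.create({ stream: true }) call shown for OpenAI above.
```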
Category 4: MCP Tool Integration
Model Context Protocol tool execution
Feature: Tool Discovery
| Property | Details |
|---|---|
| ID | mcp-discovery |
| Status | Implemented |
| Priority | Medium |
Description: Automatic discovery of available MCP tools from the FlowState MCP Gateway.
User Story: As a user, I want tools to be automatically available so that the AI can perform actions on my behalf.
Acceptance Criteria:
- [x] Connect to MCP Gateway on initialization
- [x] Fetch available tools list
- [x] Track connection status
- [x] Support tool refresh
- [x] Graceful degradation when unavailable
Implementation:
| Component | Path | Purpose |
|---|---|---|
| ChatService | src/services/ChatService.ts | MCP client initialization |
| ChatContext | src/contexts/ChatContext.tsx | Tool state management |
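A sketch of the discovery-with-degradation behavior; the gateway client interface shown here is hypothetical and stands in for @epicdm/flowstate-mcp-client.

```ts
// Hypothetical gateway client interface used only to illustrate the flow:
// connect, list tools, and degrade gracefully when the gateway is unavailable.
interface McpTool {
  name: string;
  description?: string;
  inputSchema: unknown;
}

interface McpGatewayClient {
  connect(): Promise<void>;
  listTools(): Promise<McpTool[]>;
}

async function discoverTools(
  client: McpGatewayClient
): Promise<{ tools: McpTool[]; connected: boolean }> {
  try {
    await client.connect();
    const tools = await client.listTools();
    return { tools, connected: true };
  } catch {
    // Graceful degradation: chat keeps working, just without tool execution.
    return { tools: [], connected: false };
  }
}
```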
Feature: Tool Execution Display
| Property | Details |
|---|---|
| ID | tool-display |
| Status | Implemented |
| Priority | Medium |
Description: Visual display of tool calls within messages showing arguments, results, and status.
User Story: As a user, I want to see what tools the AI is using so that I can understand its actions.
Acceptance Criteria:
- [x] Display tool name and status icon
- [x] Show pending/success/error states
- [x] Expandable argument display
- [x] Expandable result display
- [x] Color-coded status borders
Implementation:
| Component | Path | Purpose |
|---|---|---|
| ToolCallIndicator | src/components/ToolCallIndicator.tsx | Tool call visualization |
| MessageBubble | src/components/MessageBubble.tsx | Tool call integration |
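The status handling behind the color-coded borders reduces to a small mapping along these lines; status names follow the acceptance criteria, while the colors are illustrative.

```ts
// Minimal status model for the indicator; colors are placeholders.
type ToolCallStatus = 'pending' | 'success' | 'error';

const statusBorderColor: Record<ToolCallStatus, string> = {
  pending: '#f59e0b', // call issued, awaiting result
  success: '#22c55e', // result returned
  error: '#ef4444',   // execution failed
};
```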
Category 5: User Interface
Chat interface and navigation
Feature: Sidebar Navigation
| Property | Details |
|---|---|
| ID | sidebar-nav |
| Status | Implemented |
| Priority | High |
Description: Sidebar with conversation list, new chat creation, and conversation switching.
User Story: As a user, I want a sidebar so that I can easily navigate between conversations.
Acceptance Criteria:
- [x] New Chat button at top
- [x] Scrollable conversation list
- [x] Active conversation highlighting
- [x] Relative timestamp display
- [x] Delete button on hover
- [x] Loading state indicator
Implementation:
| Component | Path | Purpose |
|---|---|---|
| Sidebar | src/components/Sidebar.tsx | Sidebar container |
| ConversationItem | src/components/ConversationItem.tsx | Conversation row |
Feature: Chat Header
| Property | Details |
|---|---|
| ID | chat-header |
| Status | Implemented |
| Priority | High |
Description: Header bar with conversation title editing, model selection, and metadata display.
User Story: As a user, I want a header so that I can see and edit conversation details.
Acceptance Criteria:
- [x] Editable conversation title
- [x] Inline edit with Enter/Escape keys
- [x] Model selector dropdown
- [x] Message count display
- [x] System prompt indicator
Implementation:
| Component | Path | Purpose |
|---|---|---|
| Header | src/components/Header.tsx | Header with title and controls |
| ModelSelector | src/components/ModelSelector.tsx | Provider/model dropdown |
Capabilities
Core Capabilities
| Capability | Description | Implementation |
|---|---|---|
| Multi-Provider Chat | Support for Anthropic, OpenAI, LM Studio | LLMClient with provider abstraction |
| Real-Time Streaming | Token-by-token response display | Stream processing with state updates |
| Persistent Storage | RxDB-backed conversation history | Reactive subscriptions with collections |
| MCP Tools | Execute tools via gateway | MCPClient with tool discovery |
| Markdown Rendering | Rich content with code highlighting | react-markdown with prism highlighter |
Technical Capabilities
| Capability | Status | Notes |
|---|---|---|
| CRUD Operations | Yes | Full conversation and message CRUD |
| Real-time Updates | Yes | RxDB reactive subscriptions |
| Offline Support | Yes | RxDB local-first architecture |
| Search & Filter | No | Not currently implemented |
| Export | No | Not currently implemented |
| Notifications | No | Toast notifications pending |
Integration Capabilities
| Integration | Type | Description |
|---|---|---|
| LLM Providers | External | Anthropic, OpenAI, LM Studio APIs |
| MCP Gateway | Internal | FlowState MCP server connection |
| RxDB Server | Internal | Replication for data sync |
| FlowState Framework | Internal | Auth, org context, layout |
Data Models
Collections Used
| Collection | Description | Operations | Primary Hook |
|---|---|---|---|
| conversations | Conversation records | CRUD, Query | useConversations |
| messages | Message records | CRUD, Query | useMessages |
Entity Relationships
Conversation
|
+-- Messages (hasMany)
| |
| +-- ToolCalls (embedded array)
|
+-- Settings (embedded)
|
+-- Metadata (embedded)
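The relationships above suggest entity shapes roughly like the following; these are inferred from this report, and the authoritative schemas live in @epicdm/flowstate-collections.

```ts
// Approximate shapes of both collections, inferred from the relationships above.
// ConversationSettings repeats the earlier sketch so this block is self-contained.
interface ConversationSettings {
  provider: string;
  model: string;
  temperature: number;
  maxTokens: number;
  systemPrompt?: string;
}

interface Conversation {
  id: string;
  orgId: string;                  // organization scoping
  workspaceId: string;            // workspace scoping
  title: string;
  settings: ConversationSettings; // embedded settings
  metadata: {
    messageCount: number;         // tracked message count
    lastActivityAt: string;       // drives most-recent sorting in the sidebar
  };
}

interface ToolCall {
  name: string;
  status: 'pending' | 'success' | 'error';
  arguments: unknown;
  result?: unknown;
}

interface Message {
  id: string;
  conversationId: string; // Conversation hasMany Messages
  role: 'user' | 'assistant';
  content: string;
  toolCalls?: ToolCall[]; // embedded array
  createdAt: string;
  tokens?: number;        // token count, when available
  costUsd?: number;       // cost, when available
  latencyMs?: number;     // latency metric
}
```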
Key Data Flows
User Input --> ChatInput --> ChatContext --> ChatService --> LLM Provider
| |
| +--> RxDB (save message)
|
+--> Streaming Content --> MessageList
User Workflows
Workflow 1: New Conversation
Start a new chat conversation
Trigger: Click "New Chat" button
Steps:
| Step | Action | Component | Outcome |
|---|---|---|---|
| 1 | Click New Chat | Sidebar | Conversation created |
| 2 | Conversation selected | ChatContext | UI updates |
| 3 | Navigate to chat | Router | ChatPage displayed |
| 4 | Optional: Change model | ModelSelector | Settings updated |
| 5 | Type message | ChatInput | Input captured |
| 6 | Send message | ChatInput | Message sent to AI |
Success Outcome: New conversation with first message
Failure Handling: Error logged, user remains on page
Workflow 2: Send Message
Send a message and receive AI response
Trigger: Press Enter or click Send button
Steps:
| Step | Action | Component | Outcome |
|---|---|---|---|
| 1 | Enter message text | ChatInput | Input captured |
| 2 | Press Enter | ChatInput | Send triggered |
| 3 | User message saved | ChatService | Persisted to RxDB |
| 4 | Stream begins | ChatService | Streaming state true |
| 5 | Tokens displayed | MessageBubble | Real-time content |
| 6 | Stream completes | ChatService | Message saved |
| 7 | Metadata updated | ChatService | Conversation updated |
Success Outcome: Complete AI response displayed
Failure Handling: Error logged, streaming state reset
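Putting the steps together, the send flow can be sketched as below; the `deps` object is a hypothetical stand-in for the collaborators provided by ChatContext and ChatService.

```ts
// End-to-end sketch of the send-message workflow. All collaborators are passed
// in as a hypothetical `deps` object; the real wiring lives in ChatContext/ChatService.
interface SendDeps {
  saveMessage: (msg: { conversationId: string; role: 'user' | 'assistant'; content: string }) => Promise<void>;
  streamCompletion: (conversationId: string, text: string) => AsyncIterable<string>;
  touchConversation: (conversationId: string) => Promise<void>; // message count + last activity
  setStreaming: (on: boolean) => void;
  setStreamingContent: (content: string) => void;
}

async function sendMessage(deps: SendDeps, conversationId: string, text: string) {
  await deps.saveMessage({ conversationId, role: 'user', content: text }); // step 3: persist user message
  deps.setStreaming(true);                                                 // step 4: stream begins
  let content = '';
  try {
    for await (const delta of deps.streamCompletion(conversationId, text)) {
      content += delta;
      deps.setStreamingContent(content);                                   // step 5: tokens displayed
    }
    await deps.saveMessage({ conversationId, role: 'assistant', content }); // step 6: persist response
    await deps.touchConversation(conversationId);                           // step 7: update metadata
  } finally {
    deps.setStreaming(false); // failure handling: streaming state is always reset
  }
}
```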
Workflow 3: Switch Conversation
Navigate between existing conversations
Trigger: Click conversation in sidebar
Steps:
| Step | Action | Component | Outcome |
|---|---|---|---|
| 1 | Click conversation | ConversationItem | Click handler fired |
| 2 | Set current | ChatContext | State updated |
| 3 | Load messages | useMessages | Query executed |
| 4 | Display messages | MessageList | Messages rendered |
| 5 | Update header | Header | Title/model shown |
Success Outcome: Selected conversation fully loaded
Failure Handling: Error state displayed
Workflow 4: Delete Conversation
Remove a conversation and its messages
Trigger: Click delete button on conversation
Steps:
| Step | Action | Component | Outcome |
|---|---|---|---|
| 1 | Click delete | ConversationItem | Confirmation shown |
| 2 | Confirm deletion | Browser | Proceed if confirmed |
| 3 | Delete messages | ChatContext | All messages removed |
| 4 | Delete conversation | ChatContext | Conversation removed |
| 5 | Clear if current | ChatContext | Null current state |
| 6 | UI updates | Sidebar | List refreshed |
Success Outcome: Conversation fully removed
Failure Handling: Error logged, no partial deletion
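A cascade-delete sketch assuming RxDB collections; the collection handles and field names are assumptions.

```ts
// Delete all messages for a conversation, then the conversation itself.
import type { RxCollection } from 'rxdb';

async function deleteConversation(
  conversations: RxCollection,
  messages: RxCollection,
  conversationId: string
) {
  // Step 3: remove every message belonging to the conversation
  await messages.find({ selector: { conversationId } }).remove();
  // Step 4: remove the conversation document
  const doc = await conversations.findOne(conversationId).exec();
  await doc?.remove();
  // Steps 5-6: callers clear the current-conversation state; the sidebar list
  // refreshes automatically through the reactive subscription.
}
```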
Integration Points
Internal Integrations
| Integration | Package | Purpose |
|---|---|---|
| FlowState Framework | @epicdm/flowstate-app-framework | App container, auth, layout |
| LLM Client | @epicdm/flowstate-agents-llm-client | Provider abstraction |
| MCP Client | @epicdm/flowstate-mcp-client | Tool execution |
| Collections | @epicdm/flowstate-collections | Data models and schemas |
| RxDB React | @epicdm/flowstate-rxdb-react | Reactive subscriptions |
| UI Components | @epicdm/flowstate-ui | Shared UI components |
External Integrations
| Integration | Type | Configuration |
|---|---|---|
| Anthropic API | REST API | API key via config |
| OpenAI API | REST API | API key via config |
| LM Studio | Local API | Base URL via config |
| react-markdown | Library | Markdown rendering |
| react-syntax-highlighter | Library | Code highlighting |
MCP Integration
| Tool | Purpose | Example Usage |
|---|---|---|
| Any MCP Tool | Tool execution | Passed to AI for invocation |
Architecture Summary
Directory Structure
src/
├── components/ # UI components (8 files)
│ ├── ChatInput.tsx # Message input
│ ├── ConversationItem.tsx # Sidebar item
│ ├── Header.tsx # Chat header
│ ├── MessageBubble.tsx # Message display
│ ├── MessageList.tsx # Message container
│ ├── ModelSelector.tsx # Provider/model select
│ ├── Sidebar.tsx # Navigation sidebar
│ └── ToolCallIndicator.tsx # Tool call display
├── contexts/ # React contexts (1 file)
│ └── ChatContext.tsx # Main chat state
├── hooks/ # Custom React hooks (2 files)
│ ├── useConversations.ts # Conversation queries
│ └── useMessages.ts # Message queries
├── pages/ # Page components (1 file)
│ └── ChatPage.tsx # Main chat page
├── services/ # Business logic (1 file)
│ └── ChatService.ts # LLM and MCP operations
├── types/ # TypeScript definitions (4 files)
│ ├── chat.ts # Main types and config
│ ├── conversation.ts # Conversation types
│ ├── message.ts # Message types
│ └── index.ts # Type exports
├── utils/ # Utility functions (1 file)
│ └── index.ts # Utilities
├── App.tsx # Main app with routing
├── plugin.ts # FlowState plugin
├── standalone.tsx # Standalone entry
└── index.ts # Package exports
Key Architectural Patterns
| Pattern | Usage | Example |
|---|---|---|
| Context Provider | State management | ChatContext for conversation state |
| Custom Hooks | Data fetching | useConversations, useMessages |
| Service Layer | Business logic | ChatService for LLM operations |
| Component Composition | UI organization | MessageList > MessageBubble |
| Plugin System | App registration | chatPlugin for HostContainer |
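The plugin registration pattern can be pictured as follows, reusing the values from the App Identity table; the actual plugin contract is defined by the FlowState framework and implemented in src/plugin.ts, so treat this shape as hypothetical.

```ts
// Hypothetical plugin descriptor built from the App Identity table.
import { App } from './App';

export const chatPlugin = {
  id: 'chat',
  name: 'Chat',
  icon: 'message-square',
  color: '#06B6D4',
  basePath: '/chat',
  permissions: ['database'],
  component: App, // main component from src/App.tsx
};
```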
Services Overview
| Service | Responsibility | Key Methods |
|---|---|---|
| ChatService | LLM and MCP operations | sendMessage(), executeTool(), refreshTools() |
Hooks Overview
| Hook | Purpose | Returns |
|---|---|---|
| useConversations | Query conversations for org/workspace | { conversations, loading, error } |
| useMessages | Query messages for conversation | { messages, loading, error } |
| useChat | Access chat context | Full ChatContextValue |
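As an illustration, a hook like useConversations could combine the reactive RxDB query with org/workspace scoping along the lines below; only the returned shape is taken from the table above, the internals are assumptions.

```tsx
// Sketch of a reactive, org/workspace-scoped conversation query.
import { useEffect, useState } from 'react';
import type { RxCollection } from 'rxdb';

function useConversations(collection: RxCollection, orgId: string, workspaceId: string) {
  const [conversations, setConversations] = useState<unknown[]>([]);
  const [loading, setLoading] = useState(true);
  const [error, setError] = useState<Error | null>(null);

  useEffect(() => {
    const sub = collection
      .find({ selector: { orgId, workspaceId } })   // org isolation
      .sort({ 'metadata.lastActivityAt': 'desc' })  // most recent first
      .$                                            // reactive results observable
      .subscribe({
        next: (docs) => { setConversations(docs); setLoading(false); },
        error: (err) => { setError(err); setLoading(false); },
      });
    return () => sub.unsubscribe();
  }, [collection, orgId, workspaceId]);

  return { conversations, loading, error };
}
```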
Quality & Compliance
Test Coverage
| Test Type | Location | Coverage Target |
|---|---|---|
| Unit Tests | src/**/__tests__/ | >= 80% |
| Hook Tests | src/hooks/__tests__/ | >= 80% |
| Service Tests | src/services/__tests__/ | >= 80% |
| E2E Tests | playwright/ | Critical paths |
Accessibility
| Requirement | Status | Notes |
|---|---|---|
| WCAG 2.1 AA | In Progress | Components use semantic HTML |
| Keyboard Navigation | Yes | Enter/Escape for input |
| Screen Reader Support | Partial | Basic aria labels |
| Focus Management | Yes | Auto-focus on input |
Security Considerations
| Consideration | Implementation |
|---|---|
| API Keys | Passed via config, not stored in code |
| Input Validation | Basic validation on send |
| Data Protection | RxDB encryption support |
| Org Isolation | Conversations scoped by orgId/workspaceId |
FlowState Alignment
Spec-Driven Development
This app follows the FlowState Standard for spec-driven development:
- Requirements: .flowstate/specs/{{feature-name}}/requirements.md
- Design: .flowstate/specs/{{feature-name}}/design.md
- Wireframes: .flowstate/specs/{{feature-name}}/wireframes.html
- Tasks: .flowstate/specs/{{feature-name}}/tasks.md
Quality Gates
| Gate | Requirement | Status |
|---|---|---|
| Test Coverage | >= 80% statements | In Progress |
| TypeScript Strict | No any without justification | Pass |
| Accessibility | WCAG 2.1 AA | In Progress |
| Documentation | 100% TSDoc on public APIs | In Progress |
| Security | No critical vulnerabilities | Pass |
Appendix
Related Documentation
| Document | Location | Purpose |
|---|---|---|
| FLOWSTATE.md | .flowstate/docs/FLOWSTATE.md | Development workflow |
| TDD.md | .flowstate/steering/TDD.md | Testing standards |
| QUALITY.md | .flowstate/steering/QUALITY.md | Code quality |
Version History
| Version | Date | Author | Changes |
|---|---|---|---|
| 1.0.0 | 2026-01-22 | Claude | Initial documentation |