Chat App - Features & Capabilities Report

Template Version: 1.0.0
Generated: 2026-01-22
Package: @epicdm/flowstate-app-chat
App ID: chat


Table of Contents

  1. Overview
  2. App Identity
  3. Feature Categories
  4. Capabilities
  5. Data Models
  6. User Workflows
  7. Integration Points
  8. Architecture Summary
  9. Quality & Compliance

Overview

Purpose

The Chat App is a multi-provider LLM chat interface for FlowState. It enables users to have real-time streaming conversations with various AI models including Anthropic Claude, OpenAI GPT, and local models via LM Studio. Conversations are persistently stored in RxDB with organization and workspace scoping, enabling team collaboration and personal conversation management.

Target Users

| User Type | Description | Primary Use Cases |
| --- | --- | --- |
| Developers | Engineers needing AI assistance | Code assistance, debugging, API exploration, tool execution |
| Knowledge Workers | Business professionals | Document drafting, research, Q&A with persistent history |
| Teams | Collaborative groups | Shared workspace conversations, organization context |
| Power Users | Advanced users | Local model integration, custom system prompts |

Value Proposition

  • Multi-Provider Support: Switch between Anthropic, OpenAI, and LM Studio seamlessly
  • Real-Time Streaming: See responses token-by-token as they generate
  • Persistent History: Never lose a conversation with RxDB-backed storage
  • MCP Tool Integration: Execute tools directly from conversations
  • Markdown Excellence: Full GFM support with syntax-highlighted code blocks
  • Flexible Configuration: Per-conversation model and settings selection

App Identity

| Property | Value |
| --- | --- |
| ID | chat |
| Display Name | Chat |
| Package Name | @epicdm/flowstate-app-chat |
| Version | 0.1.0 |
| Category | business |
| Icon | message-square |
| Color | #06B6D4 |
| Base Path | /chat |
| Permissions | database |

Entry Points

| Entry Point | Path | Description |
| --- | --- | --- |
| Main Export | src/index.ts | Primary package export |
| Standalone | src/standalone.tsx | Standalone app entry |
| App Component | src/App.tsx | Main React component with routing |
| Plugin | src/plugin.ts | FlowState plugin registration |

Feature Categories

Category 1: Conversation Management

Core conversation lifecycle and organization

Feature: Conversation CRUD Operations

| Property | Details |
| --- | --- |
| ID | conversation-crud |
| Status | Implemented |
| Priority | High |

Description: Create, view, update, and delete conversations with provider/model selection, custom settings, and metadata tracking.

User Story: As a user, I want to manage my conversations so that I can organize my chat history effectively.

Acceptance Criteria:

  • [x] Create new conversations with default settings
  • [x] View conversations in sidebar list sorted by most recent
  • [x] Update conversation title inline
  • [x] Delete conversations with cascade message deletion
  • [x] Switch between conversations instantly
  • [x] Track message count and last activity
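
The lifecycle above can be sketched as an in-memory store. This is illustrative only: the real app keeps these records in RxDB collections, so the shapes and method names below are assumptions, not the actual ChatContext API.

```typescript
// Hypothetical in-memory sketch of conversation CRUD with cascade deletion.
interface Conversation { id: string; title: string; updatedAt: number; messageCount: number }
interface Message { id: string; conversationId: string; content: string }

class ConversationStore {
  private conversations = new Map<string, Conversation>();
  private messages = new Map<string, Message>();

  create(id: string, title = "New Chat"): Conversation {
    const conv: Conversation = { id, title, updatedAt: Date.now(), messageCount: 0 };
    this.conversations.set(id, conv);
    return conv;
  }

  // Sidebar order: most recent activity first
  list(): Conversation[] {
    return [...this.conversations.values()].sort((a, b) => b.updatedAt - a.updatedAt);
  }

  rename(id: string, title: string): void {
    const conv = this.conversations.get(id);
    if (conv) conv.title = title;
  }

  addMessage(msg: Message): void {
    this.messages.set(msg.id, msg);
    const conv = this.conversations.get(msg.conversationId);
    if (conv) {
      conv.messageCount += 1;      // track message count...
      conv.updatedAt = Date.now(); // ...and last activity
    }
  }

  // Deleting a conversation cascades to its messages
  remove(id: string): void {
    for (const [mid, m] of this.messages) {
      if (m.conversationId === id) this.messages.delete(mid);
    }
    this.conversations.delete(id);
  }

  messageCountFor(id: string): number {
    return [...this.messages.values()].filter((m) => m.conversationId === id).length;
  }
}
```

The cascade in `remove()` mirrors the "delete conversations with cascade message deletion" criterion: messages are removed first, then the conversation record.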

Implementation:

| Component | Path | Purpose |
| --- | --- | --- |
| Sidebar | src/components/Sidebar.tsx | Conversation list and new chat |
| ConversationItem | src/components/ConversationItem.tsx | Single conversation display |
| ChatContext | src/contexts/ChatContext.tsx | Conversation state management |
| useConversations | src/hooks/useConversations.ts | Conversation queries |

Routes:

  • /chat - Redirects to new conversation
  • /chat/:conversationId - Active conversation view

Feature: Conversation Settings

| Property | Details |
| --- | --- |
| ID | conversation-settings |
| Status | Implemented |
| Priority | Medium |

Description: Per-conversation configuration including provider, model, temperature, max tokens, and system prompt.

User Story: As a user, I want to configure each conversation's settings so that I can customize AI behavior.

Acceptance Criteria:

  • [x] Select provider (Anthropic, OpenAI, LM Studio)
  • [x] Choose model within provider
  • [x] Configure temperature setting
  • [x] Set max tokens limit
  • [x] Define custom system prompt
  • [x] Persist settings per conversation
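
The settings shape implied by the criteria above can be sketched as follows. Field names and the default model string are assumptions for illustration, not the package's actual types.

```typescript
// Hypothetical per-conversation settings type and defaults.
type Provider = "anthropic" | "openai" | "lmstudio";

interface ConversationSettings {
  provider: Provider;
  model: string;
  temperature: number; // typically 0..1
  maxTokens: number;
  systemPrompt?: string; // optional custom system prompt
}

// Assumed defaults applied when a conversation is created.
const DEFAULT_SETTINGS: ConversationSettings = {
  provider: "anthropic",
  model: "claude-sonnet-4-5", // placeholder model id
  temperature: 0.7,
  maxTokens: 4096,
};

// Merge user overrides onto the defaults when creating or editing a conversation.
function withSettings(overrides: Partial<ConversationSettings>): ConversationSettings {
  return { ...DEFAULT_SETTINGS, ...overrides };
}
```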

Implementation:

| Component | Path | Purpose |
| --- | --- | --- |
| ModelSelector | src/components/ModelSelector.tsx | Provider/model dropdown |
| Header | src/components/Header.tsx | Settings display and edit |

Category 2: Messaging

Core chat messaging with streaming

Feature: Message Sending

| Property | Details |
| --- | --- |
| ID | message-send |
| Status | Implemented |
| Priority | High |

Description: Send messages to AI models with streaming response display and automatic scroll.

User Story: As a user, I want to send messages and see AI responses in real-time so that I can have natural conversations.

Acceptance Criteria:

  • [x] Multi-line text input with auto-resize
  • [x] Enter to send, Shift+Enter for new line
  • [x] Disabled during streaming
  • [x] Message persisted to database
  • [x] Conversation metadata updated
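
The keyboard behavior in the criteria above (Enter to send, Shift+Enter for a newline, input disabled while streaming) can be expressed as a pure helper. This is a hypothetical function, not the actual ChatInput handler.

```typescript
// Sketch: decide what a keypress in the chat input should do.
interface KeyInfo { key: string; shiftKey: boolean }

function resolveInputAction(
  e: KeyInfo,
  isStreaming: boolean,
): "send" | "newline" | "none" {
  if (e.key !== "Enter") return "none"; // ordinary typing
  if (isStreaming) return "none";       // input is disabled during streaming
  return e.shiftKey ? "newline" : "send";
}
```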

Implementation:

| Component | Path | Purpose |
| --- | --- | --- |
| ChatInput | src/components/ChatInput.tsx | Message input with send button |
| ChatPage | src/pages/ChatPage.tsx | Message orchestration |
| ChatService | src/services/ChatService.ts | Message sending logic |

Feature: Streaming Responses

| Property | Details |
| --- | --- |
| ID | streaming-responses |
| Status | Implemented |
| Priority | High |

Description: Real-time token-by-token response streaming with visual indicators and auto-scroll.

User Story: As a user, I want to see AI responses appear in real-time so that I don't have to wait for complete responses.

Acceptance Criteria:

  • [x] Token-by-token content update
  • [x] Streaming indicator visible during generation
  • [x] Auto-scroll to latest content
  • [x] Support for both API proxy and direct client modes
  • [x] Graceful error handling
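
Token-by-token updating can be sketched with an async iterator: each token extends the accumulated content, and the UI is re-rendered with every partial. The event shape below is an assumption for illustration, not ChatService's actual stream format.

```typescript
// Sketch: consume a token stream and drive incremental UI updates.
type StreamEvent = { type: "token"; text: string } | { type: "done" };

async function consumeStream(
  events: AsyncIterable<StreamEvent>,
  onUpdate: (partial: string) => void,
): Promise<string> {
  let content = "";
  for await (const ev of events) {
    if (ev.type === "token") {
      content += ev.text;
      onUpdate(content); // token-by-token content update
    }
  }
  return content; // full response once the stream completes
}

// Fake provider stream for demonstration.
async function* fakeStream(tokens: string[]) {
  for (const t of tokens) yield { type: "token", text: t } as StreamEvent;
  yield { type: "done" } as StreamEvent;
}
```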

Implementation:

| Component | Path | Purpose |
| --- | --- | --- |
| MessageList | src/components/MessageList.tsx | Message display with streaming |
| MessageBubble | src/components/MessageBubble.tsx | Individual message render |
| ChatContext | src/contexts/ChatContext.tsx | Streaming state management |
| ChatService | src/services/ChatService.ts | Stream processing |

Feature: Message Display

| Property | Details |
| --- | --- |
| ID | message-display |
| Status | Implemented |
| Priority | High |

Description: Rich message rendering with markdown support, syntax-highlighted code blocks, and metadata display.

User Story: As a user, I want messages to be beautifully formatted so that I can easily read code and structured content.

Acceptance Criteria:

  • [x] Markdown rendering with GFM support
  • [x] Syntax-highlighted code blocks
  • [x] User/assistant message differentiation
  • [x] Timestamp formatting (relative/absolute)
  • [x] Token count and cost display (when available)
  • [x] Latency metrics display
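
A relative/absolute timestamp helper of the kind the metadata display needs might look like this. The thresholds and output strings are illustrative assumptions.

```typescript
// Sketch: relative timestamps for recent messages, absolute date beyond a day.
function formatRelative(then: number, now: number = Date.now()): string {
  const sec = Math.floor((now - then) / 1000);
  if (sec < 60) return "just now";
  const min = Math.floor(sec / 60);
  if (min < 60) return `${min}m ago`;
  const hr = Math.floor(min / 60);
  if (hr < 24) return `${hr}h ago`;
  // fall back to an absolute date after a day
  return new Date(then).toLocaleDateString();
}
```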

Implementation:

| Component | Path | Purpose |
| --- | --- | --- |
| MessageBubble | src/components/MessageBubble.tsx | Message card with markdown |
| MessageList | src/components/MessageList.tsx | Scrollable message list |

Dependencies:

  • react-markdown - Markdown parsing
  • remark-gfm - GitHub Flavored Markdown
  • react-syntax-highlighter - Code syntax highlighting

Category 3: LLM Provider Integration

Multi-provider AI model support

Feature: Anthropic Claude Support

| Property | Details |
| --- | --- |
| ID | provider-anthropic |
| Status | Implemented |
| Priority | High |

Description: Integration with Anthropic's Claude models including Sonnet, Opus, and Haiku variants.

User Story: As a user, I want to use Anthropic Claude models so that I can leverage their capabilities.

Acceptance Criteria:

  • [x] Claude Sonnet 4.5 support
  • [x] Claude Opus 4.1 support
  • [x] Claude Haiku 4.5 support
  • [x] Previous generation models (4.0, 4)
  • [x] Streaming responses
  • [x] Token usage tracking

Implementation:

| Component | Path | Purpose |
| --- | --- | --- |
| ChatService | src/services/ChatService.ts | Anthropic client initialization |
| ModelSelector | src/components/ModelSelector.tsx | Model selection UI |

Feature: OpenAI GPT Support

| Property | Details |
| --- | --- |
| ID | provider-openai |
| Status | Implemented |
| Priority | High |

Description: Integration with OpenAI's GPT models including GPT-4o, GPT-4 Turbo, and GPT-3.5 Turbo.

User Story: As a user, I want to use OpenAI GPT models so that I can access their capabilities.

Acceptance Criteria:

  • [x] GPT-4o support
  • [x] GPT-4 Turbo support
  • [x] GPT-3.5 Turbo support
  • [x] Streaming responses
  • [x] Token usage tracking

Implementation:

| Component | Path | Purpose |
| --- | --- | --- |
| ChatService | src/services/ChatService.ts | OpenAI client initialization |
| ModelSelector | src/components/ModelSelector.tsx | Model selection UI |

Feature: LM Studio Support

| Property | Details |
| --- | --- |
| ID | provider-lmstudio |
| Status | Implemented |
| Priority | Medium |

Description: Integration with local models via LM Studio's OpenAI-compatible API.

User Story: As a power user, I want to use local models so that I can run AI privately.

Acceptance Criteria:

  • [x] Connect to local LM Studio server
  • [x] Compatible with any loaded model
  • [x] Streaming responses
  • [x] Configurable base URL
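
Because LM Studio exposes an OpenAI-compatible API, the client configuration reduces to a base URL. The address below is LM Studio's commonly documented local default, but it is an assumption here; adjust it to your setup.

```typescript
// Sketch: configuration for a local LM Studio provider (values are assumptions).
interface ProviderConfig {
  baseUrl: string;
  apiKey?: string; // typically unused for a local server
}

const lmStudioConfig: ProviderConfig = {
  baseUrl: "http://localhost:1234/v1", // LM Studio's default local server address
};
```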

Implementation:

| Component | Path | Purpose |
| --- | --- | --- |
| ChatService | src/services/ChatService.ts | LM Studio client initialization |
| ModelSelector | src/components/ModelSelector.tsx | Model selection UI |

Category 4: MCP Tool Integration

Model Context Protocol tool execution

Feature: Tool Discovery

| Property | Details |
| --- | --- |
| ID | mcp-discovery |
| Status | Implemented |
| Priority | Medium |

Description: Automatic discovery of available MCP tools from the FlowState MCP Gateway.

User Story: As a user, I want tools to be automatically available so that the AI can perform actions on my behalf.

Acceptance Criteria:

  • [x] Connect to MCP Gateway on initialization
  • [x] Fetch available tools list
  • [x] Track connection status
  • [x] Support tool refresh
  • [x] Graceful degradation when unavailable
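
The graceful-degradation criterion can be sketched as a discovery function that never throws: if the gateway is unreachable, chat continues with an empty tool list. The client interface below is a stand-in, not the real @epicdm/flowstate-mcp-client API.

```typescript
// Sketch: tool discovery with graceful degradation when the gateway is down.
interface Tool { name: string; description: string }
interface GatewayClient { listTools(): Promise<Tool[]> }

interface ToolState { connected: boolean; tools: Tool[] }

async function discoverTools(client: GatewayClient): Promise<ToolState> {
  try {
    const tools = await client.listTools();
    return { connected: true, tools }; // connection status is tracked alongside the list
  } catch {
    // Chat still works without tools; report the gateway as unavailable.
    return { connected: false, tools: [] };
  }
}
```

Calling `discoverTools` again later gives the "support tool refresh" behavior for free.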

Implementation:

| Component | Path | Purpose |
| --- | --- | --- |
| ChatService | src/services/ChatService.ts | MCP client initialization |
| ChatContext | src/contexts/ChatContext.tsx | Tool state management |

Feature: Tool Execution Display

| Property | Details |
| --- | --- |
| ID | tool-display |
| Status | Implemented |
| Priority | Medium |

Description: Visual display of tool calls within messages showing arguments, results, and status.

User Story: As a user, I want to see what tools the AI is using so that I can understand its actions.

Acceptance Criteria:

  • [x] Display tool name and status icon
  • [x] Show pending/success/error states
  • [x] Expandable argument display
  • [x] Expandable result display
  • [x] Color-coded status borders

Implementation:

| Component | Path | Purpose |
| --- | --- | --- |
| ToolCallIndicator | src/components/ToolCallIndicator.tsx | Tool call visualization |
| MessageBubble | src/components/MessageBubble.tsx | Tool call integration |

Category 5: User Interface

Chat interface and navigation

Feature: Sidebar Navigation

| Property | Details |
| --- | --- |
| ID | sidebar-nav |
| Status | Implemented |
| Priority | High |

Description: Sidebar with conversation list, new chat creation, and conversation switching.

User Story: As a user, I want a sidebar so that I can easily navigate between conversations.

Acceptance Criteria:

  • [x] New Chat button at top
  • [x] Scrollable conversation list
  • [x] Active conversation highlighting
  • [x] Relative timestamp display
  • [x] Delete button on hover
  • [x] Loading state indicator

Implementation:

| Component | Path | Purpose |
| --- | --- | --- |
| Sidebar | src/components/Sidebar.tsx | Sidebar container |
| ConversationItem | src/components/ConversationItem.tsx | Conversation row |

Feature: Chat Header

| Property | Details |
| --- | --- |
| ID | chat-header |
| Status | Implemented |
| Priority | High |

Description: Header bar with conversation title editing, model selection, and metadata display.

User Story: As a user, I want a header so that I can see and edit conversation details.

Acceptance Criteria:

  • [x] Editable conversation title
  • [x] Inline edit with Enter/Escape keys
  • [x] Model selector dropdown
  • [x] Message count display
  • [x] System prompt indicator

Implementation:

| Component | Path | Purpose |
| --- | --- | --- |
| Header | src/components/Header.tsx | Header with title and controls |
| ModelSelector | src/components/ModelSelector.tsx | Provider/model dropdown |

Capabilities

Core Capabilities

| Capability | Description | Implementation |
| --- | --- | --- |
| Multi-Provider Chat | Support for Anthropic, OpenAI, LM Studio | LLMClient with provider abstraction |
| Real-Time Streaming | Token-by-token response display | Stream processing with state updates |
| Persistent Storage | RxDB-backed conversation history | Reactive subscriptions with collections |
| MCP Tools | Execute tools via gateway | MCPClient with tool discovery |
| Markdown Rendering | Rich content with code highlighting | react-markdown with prism highlighter |
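
The "provider abstraction" capability amounts to a common interface plus a registry keyed by provider id. The interface and registry below are illustrative, not the real LLMClient API.

```typescript
// Sketch: a minimal provider abstraction with a registry.
interface ChatMessage { role: "user" | "assistant" | "system"; content: string }

interface LLMProvider {
  id: string;
  chat(messages: ChatMessage[]): Promise<string>;
}

class ProviderRegistry {
  private providers = new Map<string, LLMProvider>();

  register(p: LLMProvider): void {
    this.providers.set(p.id, p);
  }

  get(id: string): LLMProvider {
    const p = this.providers.get(id);
    if (!p) throw new Error(`Unknown provider: ${id}`);
    return p;
  }
}
```

Switching a conversation between Anthropic, OpenAI, and LM Studio then becomes a lookup by the conversation's `provider` setting rather than a branch in the chat logic.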

Technical Capabilities

| Capability | Status | Notes |
| --- | --- | --- |
| CRUD Operations | Yes | Full conversation and message CRUD |
| Real-time Updates | Yes | RxDB reactive subscriptions |
| Offline Support | Yes | RxDB local-first architecture |
| Search & Filter | No | Not currently implemented |
| Export | No | Not currently implemented |
| Notifications | No | Toast notifications pending |

Integration Capabilities

| Integration | Type | Description |
| --- | --- | --- |
| LLM Providers | External | Anthropic, OpenAI, LM Studio APIs |
| MCP Gateway | Internal | FlowState MCP server connection |
| RxDB Server | Internal | Replication for data sync |
| FlowState Framework | Internal | Auth, org context, layout |

Data Models

Collections Used

| Collection | Description | Operations | Primary Hook |
| --- | --- | --- | --- |
| conversations | Conversation records | CRUD, Query | useConversations |
| messages | Message records | CRUD, Query | useMessages |

Entity Relationships

Conversation
    |
    +-- Messages (hasMany)
    |       |
    |       +-- ToolCalls (embedded array)
    |
    +-- Settings (embedded)
    |
    +-- Metadata (embedded)
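
The relationships above can be expressed as illustrative TypeScript types. Field names are assumptions based on this report, not the actual collection schemas.

```typescript
// Sketch: entity shapes implied by the relationship diagram.
interface ToolCall {
  id: string;
  name: string;
  args: unknown;
  status: "pending" | "success" | "error";
}

interface Message {
  id: string;
  conversationId: string; // Conversation hasMany Messages
  role: "user" | "assistant";
  content: string;
  toolCalls?: ToolCall[]; // embedded array
}

interface Conversation {
  id: string;
  orgId: string;       // organization scoping
  workspaceId: string; // workspace scoping
  title: string;
  settings: { provider: string; model: string };            // embedded
  metadata: { messageCount: number; lastActivity: number }; // embedded
}
```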

Key Data Flows

User Input --> ChatInput --> ChatContext --> ChatService --> LLM Provider
                                    |                |
                                    |                +--> RxDB (save message)
                                    |
                                    +--> Streaming Content --> MessageList
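
The flow above can be sketched end to end with the three seams it touches: persistence, the provider stream, and the rendering callback. These seams are hypothetical stand-ins, not the real ChatService interfaces.

```typescript
// Sketch: the send-message data flow with injected seams.
type Role = "user" | "assistant";
interface Msg { role: Role; content: string }

interface Deps {
  save: (m: Msg) => void;                            // RxDB persistence seam
  stream: (history: Msg[]) => AsyncIterable<string>; // LLM provider seam
  render: (partial: string) => void;                 // MessageList update seam
}

async function sendMessage(history: Msg[], text: string, deps: Deps): Promise<Msg> {
  const userMsg: Msg = { role: "user", content: text };
  deps.save(userMsg); // persist the user message before streaming begins
  let content = "";
  for await (const token of deps.stream([...history, userMsg])) {
    content += token;
    deps.render(content); // token-by-token UI update
  }
  const reply: Msg = { role: "assistant", content };
  deps.save(reply); // persist the completed response
  return reply;
}
```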

User Workflows

Workflow 1: New Conversation

Start a new chat conversation

Trigger: Click "New Chat" button

Steps:

| Step | Action | Component | Outcome |
| --- | --- | --- | --- |
| 1 | Click New Chat | Sidebar | Conversation created |
| 2 | Conversation selected | ChatContext | UI updates |
| 3 | Navigate to chat | Router | ChatPage displayed |
| 4 | Optional: Change model | ModelSelector | Settings updated |
| 5 | Type message | ChatInput | Input captured |
| 6 | Send message | ChatInput | Message sent to AI |

Success Outcome: New conversation with first message
Failure Handling: Error logged, user remains on page


Workflow 2: Send Message

Send a message and receive AI response

Trigger: Press Enter or click Send button

Steps:

| Step | Action | Component | Outcome |
| --- | --- | --- | --- |
| 1 | Enter message text | ChatInput | Input captured |
| 2 | Press Enter | ChatInput | Send triggered |
| 3 | User message saved | ChatService | Persisted to RxDB |
| 4 | Stream begins | ChatService | Streaming state true |
| 5 | Tokens displayed | MessageBubble | Real-time content |
| 6 | Stream completes | ChatService | Message saved |
| 7 | Metadata updated | ChatService | Conversation updated |

Success Outcome: Complete AI response displayed
Failure Handling: Error logged, streaming state reset


Workflow 3: Switch Conversation

Navigate between existing conversations

Trigger: Click conversation in sidebar

Steps:

| Step | Action | Component | Outcome |
| --- | --- | --- | --- |
| 1 | Click conversation | ConversationItem | Click handler fired |
| 2 | Set current | ChatContext | State updated |
| 3 | Load messages | useMessages | Query executed |
| 4 | Display messages | MessageList | Messages rendered |
| 5 | Update header | Header | Title/model shown |

Success Outcome: Selected conversation fully loaded
Failure Handling: Error state displayed


Workflow 4: Delete Conversation

Remove a conversation and its messages

Trigger: Click delete button on conversation

Steps:

| Step | Action | Component | Outcome |
| --- | --- | --- | --- |
| 1 | Click delete | ConversationItem | Confirmation shown |
| 2 | Confirm deletion | Browser | Proceed if confirmed |
| 3 | Delete messages | ChatContext | All messages removed |
| 4 | Delete conversation | ChatContext | Conversation removed |
| 5 | Clear if current | ChatContext | Null current state |
| 6 | UI updates | Sidebar | List refreshed |

Success Outcome: Conversation fully removed
Failure Handling: Error logged, no partial deletion


Integration Points

Internal Integrations

| Integration | Package | Purpose |
| --- | --- | --- |
| FlowState Framework | @epicdm/flowstate-app-framework | App container, auth, layout |
| LLM Client | @epicdm/flowstate-agents-llm-client | Provider abstraction |
| MCP Client | @epicdm/flowstate-mcp-client | Tool execution |
| Collections | @epicdm/flowstate-collections | Data models and schemas |
| RxDB React | @epicdm/flowstate-rxdb-react | Reactive subscriptions |
| UI Components | @epicdm/flowstate-ui | Shared UI components |

External Integrations

| Integration | Type | Configuration |
| --- | --- | --- |
| Anthropic API | REST API | API key via config |
| OpenAI API | REST API | API key via config |
| LM Studio | Local API | Base URL via config |
| react-markdown | Library | Markdown rendering |
| react-syntax-highlighter | Library | Code highlighting |

MCP Integration

| Tool | Purpose | Example Usage |
| --- | --- | --- |
| Any MCP Tool | Tool execution | Passed to AI for invocation |

Architecture Summary

Directory Structure

src/
├── components/          # UI components (8 files)
│   ├── ChatInput.tsx   # Message input
│   ├── ConversationItem.tsx  # Sidebar item
│   ├── Header.tsx      # Chat header
│   ├── MessageBubble.tsx  # Message display
│   ├── MessageList.tsx # Message container
│   ├── ModelSelector.tsx  # Provider/model select
│   ├── Sidebar.tsx     # Navigation sidebar
│   └── ToolCallIndicator.tsx  # Tool call display
├── contexts/           # React contexts (1 file)
│   └── ChatContext.tsx # Main chat state
├── hooks/              # Custom React hooks (2 files)
│   ├── useConversations.ts  # Conversation queries
│   └── useMessages.ts  # Message queries
├── pages/              # Page components (1 file)
│   └── ChatPage.tsx    # Main chat page
├── services/           # Business logic (1 file)
│   └── ChatService.ts  # LLM and MCP operations
├── types/              # TypeScript definitions (4 files)
│   ├── chat.ts         # Main types and config
│   ├── conversation.ts # Conversation types
│   ├── message.ts      # Message types
│   └── index.ts        # Type exports
├── utils/              # Utility functions (1 file)
│   └── index.ts        # Utilities
├── App.tsx             # Main app with routing
├── plugin.ts           # FlowState plugin
├── standalone.tsx      # Standalone entry
└── index.ts            # Package exports

Key Architectural Patterns

| Pattern | Usage | Example |
| --- | --- | --- |
| Context Provider | State management | ChatContext for conversation state |
| Custom Hooks | Data fetching | useConversations, useMessages |
| Service Layer | Business logic | ChatService for LLM operations |
| Component Composition | UI organization | MessageList > MessageBubble |
| Plugin System | App registration | chatPlugin for HostContainer |

Services Overview

| Service | Responsibility | Key Methods |
| --- | --- | --- |
| ChatService | LLM and MCP operations | sendMessage(), executeTool(), refreshTools() |

Hooks Overview

| Hook | Purpose | Returns |
| --- | --- | --- |
| useConversations | Query conversations for org/workspace | { conversations, loading, error } |
| useMessages | Query messages for conversation | { messages, loading, error } |
| useChat | Access chat context | Full ChatContextValue |

Quality & Compliance

Test Coverage

| Test Type | Location | Coverage Target |
| --- | --- | --- |
| Unit Tests | src/**/__tests__/ | >= 80% |
| Hook Tests | src/hooks/__tests__/ | >= 80% |
| Service Tests | src/services/__tests__/ | >= 80% |
| E2E Tests | playwright/ | Critical paths |

Accessibility

| Requirement | Status | Notes |
| --- | --- | --- |
| WCAG 2.1 AA | In Progress | Components use semantic HTML |
| Keyboard Navigation | Yes | Enter/Escape for input |
| Screen Reader Support | Partial | Basic aria labels |
| Focus Management | Yes | Auto-focus on input |

Security Considerations

| Consideration | Implementation |
| --- | --- |
| API Keys | Passed via config, not stored in code |
| Input Validation | Basic validation on send |
| Data Protection | RxDB encryption support |
| Org Isolation | Conversations scoped by orgId/workspaceId |

FlowState Alignment

Spec-Driven Development

This app follows the FlowState Standard for spec-driven development:

  • Requirements: .flowstate/specs/{{feature-name}}/requirements.md
  • Design: .flowstate/specs/{{feature-name}}/design.md
  • Wireframes: .flowstate/specs/{{feature-name}}/wireframes.html
  • Tasks: .flowstate/specs/{{feature-name}}/tasks.md

Quality Gates

| Gate | Requirement | Status |
| --- | --- | --- |
| Test Coverage | >= 80% statements | In Progress |
| TypeScript Strict | No any without justification | Pass |
| Accessibility | WCAG 2.1 AA | In Progress |
| Documentation | 100% TSDoc on public APIs | In Progress |
| Security | No critical vulnerabilities | Pass |

Appendix

| Document | Location | Purpose |
| --- | --- | --- |
| FLOWSTATE.md | .flowstate/docs/FLOWSTATE.md | Development workflow |
| TDD.md | .flowstate/steering/TDD.md | Testing standards |
| QUALITY.md | .flowstate/steering/QUALITY.md | Code quality |

Version History

| Version | Date | Author | Changes |
| --- | --- | --- | --- |
| 1.0.0 | 2026-01-22 | Claude | Initial documentation |