Chat

The AI chat interface provides context-aware conversations powered by custom organization embeddings.

Concepts

Concept        Description
Chat           A conversation thread with message history
Message        User or assistant message in a chat
Pinned Chat    Bookmarked chat for quick access
SSE Streaming  Real-time message streaming from the AI
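
The shapes below are a hypothetical sketch inferred from how the hook and components use these objects elsewhere on this page; the real type definitions in the codebase may differ.

// Hypothetical shapes, inferred from usage on this page (not the actual type definitions).
interface Message {
  role: "user" | "assistant";
  content: string;
}

interface Chat {
  id: string;
  title: string;
  pinned: boolean;
  archived: boolean;
  messages: Message[];
}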

useChat Hook

hooks/use-chat.ts

Central hook for all chat operations.

const {
  // Queries
  chats,           // Chat[] - All user's chats
  pinnedChats,     // Chat[] - Pinned chats
  chat,            // Chat - Current chat detail
  messages,        // Message[] - Current chat messages

  // Loading states
  isLoadingChats,
  isLoadingChat,

  // Mutations
  createChat,      // (text: string) => Promise<Chat>
  sendMessage,     // (prompt: string) => Promise<void>
  pinChat,         // () => Promise<void>
  archiveChat,     // () => Promise<void>
  updateTitle,     // (title: string) => Promise<void>
  deleteChat       // () => Promise<void>
} = useChat(organizationSlug, chatId?);
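
The streaming code later on this page mutates the TanStack Query cache via queryClient, so the hook is presumably built on TanStack Query. The following is only a rough sketch of how its query side could be wired; the apiFetch helper and every query key except ["chat", chatId] are assumptions, not the actual implementation.

import { useQuery } from "@tanstack/react-query";

// Hypothetical fetch helper; the real codebase likely has its own API client.
async function apiFetch<T>(path: string, token?: string): Promise<T> {
  const res = await fetch(path, {
    headers: token ? { Authorization: `Bearer ${token}` } : undefined,
  });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json();
}

export function useChat(organizationSlug: string, chatId?: string) {
  const { data: chats = [], isLoading: isLoadingChats } = useQuery({
    queryKey: ["chats", organizationSlug],
    queryFn: () => apiFetch<Chat[]>("/chats"),
  });

  const { data: pinnedChats = [] } = useQuery({
    queryKey: ["chats", organizationSlug, "pinned"],
    queryFn: () => apiFetch<Chat[]>("/chats/pinned"),
  });

  // The detail query only runs when a chatId is provided
  const { data: chat, isLoading: isLoadingChat } = useQuery({
    queryKey: ["chat", chatId],
    queryFn: () => apiFetch<Chat>(`/chats/${chatId}`),
    enabled: !!chatId,
  });

  return {
    chats,
    pinnedChats,
    chat,
    messages: chat?.messages ?? [],
    isLoadingChats,
    isLoadingChat,
    // ...mutations (createChat, sendMessage, pinChat, archiveChat, updateTitle, deleteChat)
  };
}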

API Endpoints

Endpoint                  Method  Purpose
/chats                    GET     List all chats
/chats/pinned             GET     List pinned chats
/chats/{id}               GET     Chat detail + messages
/chats                    POST    Create chat with initial message
/chats/{id}/messages      POST    Send message (SSE stream)
/chats/{id}/pin           POST    Toggle pinned
/chats/{id}/archive       POST    Toggle archived
/chats/{id}/update-title  POST    Rename chat
/chats/{id}               DELETE  Delete chat
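
As a hedged sketch of calling the create endpoint: the request body shape { text } is inferred from the hook's createChat signature and may differ from the actual API client.

// Illustrative call to POST /chats; body and response shapes are assumptions.
async function createChatRequest(token: string, text: string): Promise<Chat> {
  const response = await fetch("/chats", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ text }),
  });
  if (!response.ok) {
    throw new Error(`Failed to create chat: ${response.status}`);
  }
  return response.json();
}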

SSE Message Streaming

Messages are streamed in real time using Server-Sent Events. Because the browser's native EventSource API only supports GET requests, the client reads the POST response body directly with fetch and a stream reader, as shown in the implementation below.

Stream Format

data: {"type": "chunk", "text": "Hello"}
data: {"type": "chunk", "text": " world"}
data: {"type": "done"}

Implementation

const sendMessage = async (prompt: string) => {
  // Optimistically add the user message plus an empty assistant placeholder
  // that the streaming updates below will fill in
  queryClient.setQueryData(["chat", chatId], (old) => ({
    ...old,
    messages: [
      ...old.messages,
      { role: "user", content: prompt },
      { role: "assistant", content: "" }
    ]
  }));

  // Start streaming
  const response = await fetch(`/chats/${chatId}/messages`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json"
    },
    body: JSON.stringify({ prompt })
  });

  if (!response.ok || !response.body) {
    throw new Error(`Failed to send message: ${response.status}`);
  }

  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let accumulatedContent = "";
  let buffer = "";

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;

    // A network chunk can end mid-line, so keep any trailing partial line in a buffer
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? "";

    for (const line of lines) {
      if (line.startsWith("data: ")) {
        const json = JSON.parse(line.slice(6));

        if (json.type === "chunk") {
          accumulatedContent += json.text;
          // Update cache with accumulated content
          updateAssistantMessage(accumulatedContent);
        }

        if (json.type === "done") {
          // Finalize message by refetching the persisted chat
          queryClient.invalidateQueries(["chat", chatId]);
        }
      }
    }
  }
};

Debounced UI Updates

To prevent excessive re-renders during streaming:

const updateAssistantMessage = useDebouncedCallback(
  (content: string) => {
    queryClient.setQueryData(["chat", chatId], (old) => ({
      ...old,
      // Replace the assistant placeholder (the last message) with the streamed content so far
      messages: [
        ...old.messages.slice(0, -1),
        { role: "assistant", content }
      ]
    }));
  },
  50 // 50ms debounce
);

Pages

Chat Home

/portal/[slug]/[teamID]/chat

New chat interface:

  - Empty state with suggested prompts
  - Input to start a new conversation
  - Creates the chat and redirects to the detail page
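
A minimal sketch of this create-and-redirect flow, assuming the Next.js App Router and the createChat mutation from useChat; the import paths and component wiring are illustrative, not the actual file contents.

// Illustrative only: create a chat from the initial prompt, then open its detail page.
import { useRouter } from "next/navigation";
import { useChat } from "@/hooks/use-chat";              // illustrative path
import { ChatInput } from "@/components/forms/chat-form"; // illustrative path

export default function ChatHomePage({ params }) {
  const { slug, teamID } = params;
  const router = useRouter();
  const { createChat } = useChat(slug);

  const handleStart = async (text: string) => {
    // Create the chat with its initial message, then jump to the detail page
    const chat = await createChat(text);
    router.push(`/portal/${slug}/${teamID}/chat/${chat.id}`);
  };

  return <ChatInput onSend={handleStart} />;
}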

Chat Detail

/portal/[slug]/[teamID]/chat/[chatID]

Full chat interface:

  - Message history
  - Streaming responses
  - Pin/unpin action
  - Archive action

export default function ChatPage({ params }) {
  const { chat, messages, sendMessage, pinChat } = useChat(
    params.slug,
    params.chatID
  );

  return (
    <div>
      <ChatHeader chat={chat} onPin={pinChat} />
      <MessageList messages={messages} />
      <ChatInput onSend={sendMessage} />
    </div>
  );
}

Components

ChatForm

components/forms/chat-form.tsx

Main chat interface component:

  - Message list with virtual scrolling
  - Input with send button
  - Loading states during streaming
  - Error handling
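
Virtual scrolling keeps long message histories cheap to render by only mounting the rows in view. The sketch below is an assumption for illustration: the library choice (@tanstack/react-virtual) and component shape may not match the actual ChatForm.

// Illustrative only: a windowed message list. Message is the shape sketched earlier.
import { useRef } from "react";
import { useVirtualizer } from "@tanstack/react-virtual";

function VirtualMessageList({ messages }: { messages: Message[] }) {
  const parentRef = useRef<HTMLDivElement>(null);

  const virtualizer = useVirtualizer({
    count: messages.length,
    getScrollElement: () => parentRef.current,
    estimateSize: () => 72, // rough estimated row height in px
    overscan: 8,
  });

  return (
    <div ref={parentRef} style={{ height: "100%", overflowY: "auto" }}>
      <div style={{ height: virtualizer.getTotalSize(), position: "relative" }}>
        {virtualizer.getVirtualItems().map((row) => (
          <div
            key={row.index}
            style={{
              position: "absolute",
              top: 0,
              width: "100%",
              transform: `translateY(${row.start}px)`,
            }}
          >
            {messages[row.index].content}
          </div>
        ))}
      </div>
    </div>
  );
}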

ChatModal

components/forms/chat-modal.tsx

Modal variant for inline chat access.

Pinned chats appear in the portal sidebar:

// In PortalSidebar
const { pinnedChats } = useChat(slug);

<SidebarGroup>
  <SidebarGroupLabel>Chat</SidebarGroupLabel>
  <SidebarMenu>
    <SidebarMenuItem>
      <Link href={`/portal/${slug}/${teamID}/chat`}>
        New Chat
      </Link>
    </SidebarMenuItem>
    {pinnedChats.map(chat => (
      <SidebarMenuItem key={chat.id}>
        <Link href={`/portal/${slug}/${teamID}/chat/${chat.id}`}>
          {chat.title}
        </Link>
      </SidebarMenuItem>
    ))}
  </SidebarMenu>
</SidebarGroup>

Context Awareness

The chat system uses embeddings for context:

  1. Query Processing: User message analyzed for intent
  2. Context Retrieval: Relevant embeddings searched
  3. Prompt Augmentation: Context injected into prompt
  4. Response Generation: AI responds with awareness of context

This means chats can answer questions about:

  - Uploaded documents
  - Session logs and clips
  - Training data
  - Organization-specific knowledge
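
Conceptually, the four stages above could be sketched as below. This is not the actual backend, which is outside this frontend codebase; every helper here is an injected placeholder used only to make the pipeline concrete.

// Conceptual sketch only; all helpers are placeholders, not real APIs.
interface ContextPipelineDeps {
  embed: (text: string) => Promise<number[]>;
  searchEmbeddings: (orgId: string, query: number[], opts: { topK: number }) => Promise<string[]>;
  buildPrompt: (args: { context: string[]; userMessage: string }) => string;
  streamCompletion: (prompt: string) => AsyncIterable<string>;
}

async function answerWithContext(
  deps: ContextPipelineDeps,
  organizationId: string,
  userMessage: string
) {
  // 1. Query processing: embed the user message
  const queryEmbedding = await deps.embed(userMessage);

  // 2. Context retrieval: find the most similar organization embeddings
  const context = await deps.searchEmbeddings(organizationId, queryEmbedding, { topK: 5 });

  // 3. Prompt augmentation: inject the retrieved context into the prompt
  const prompt = deps.buildPrompt({ context, userMessage });

  // 4. Response generation: stream the completion back to the client over SSE
  return deps.streamCompletion(prompt);
}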

Chat Lifecycle

  1. Create Chat
     - User sends initial message
     - New chat created with message
     - AI streams response

  2. Continue Conversation
     - User sends follow-up
     - Full history maintained
     - Context preserved across messages

  3. Organize Chats
     - Pin important chats
     - Archive completed conversations
     - Rename for clarity

  4. Delete
     - Remove chat and all messages
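
The organize and delete steps map directly onto the hook's mutations. An illustrative usage sketch (the title string is made up):

// Illustrative calls to the lifecycle mutations exposed by useChat.
const { pinChat, archiveChat, updateTitle, deleteChat } = useChat(organizationSlug, chatId);

await updateTitle("Session review");  // Rename for clarity
await pinChat();                      // Pin an important chat
await archiveChat();                  // Archive a completed conversation
await deleteChat();                   // Remove the chat and all its messages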