# Chatbot

URL: /examples/chatbot

An example of how to use the AI Elements to build a chatbot.

## Tutorial

Let's walk through how to build a chatbot using AI Elements and the AI SDK. Our example will include reasoning, web search with citations, and a model picker.

### Setup

First, set up a new Next.js repo and cd into it by running the following command (make sure you choose to use Tailwind during project setup):

```bash title="Terminal"
npx create-next-app@latest ai-chatbot && cd ai-chatbot
```

Run the following command to install AI Elements. This will also set up shadcn/ui if you haven't already configured it:

```bash title="Terminal"
npx ai-elements@latest
```

Now, install the AI SDK dependencies:

```package-install
npm i ai @ai-sdk/react zod
```

In order to use the providers, let's configure an AI Gateway API key. Create a `.env.local` file in your root directory, then navigate [here](https://vercel.com/d?to=%2F%5Bteam%5D%2F%7E%2Fai%2Fapi-keys&title=Get%20your%20AI%20Gateway%20key) to create a token and paste it into your `.env.local`.
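Assuming you're using the AI Gateway provider built into the AI SDK, the token is read from the `AI_GATEWAY_API_KEY` environment variable, so your `.env.local` should look something like this (placeholder value shown):

```bash title=".env.local"
# Replace the placeholder with the token you created in the Vercel dashboard
AI_GATEWAY_API_KEY=your_ai_gateway_api_key
```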
We're now ready to start building our app!

### Client

In your `app/page.tsx`, replace the code with the file below. Here, we use `PromptInput` and its compound components to build a rich input experience with file attachments, a model picker, and an action menu. The input component uses the `PromptInputMessage` type to handle both text and file attachments. The whole chat lives in a `Conversation`. We switch on `message.parts` and render each part within `Message`, `Reasoning`, and `Sources`. We also use `status` from `useChat` to stream reasoning tokens and to render a `Loader` while a response is pending.

```tsx title="app/page.tsx"
"use client";

import {
  Conversation,
  ConversationContent,
  ConversationScrollButton,
} from "@/components/ai-elements/conversation";
import { Message, MessageContent } from "@/components/ai-elements/message";
import {
  PromptInput,
  PromptInputActionAddAttachments,
  PromptInputActionMenu,
  PromptInputActionMenuContent,
  PromptInputActionMenuTrigger,
  PromptInputAttachment,
  PromptInputAttachments,
  PromptInputBody,
  PromptInputButton,
  PromptInputHeader,
  type PromptInputMessage,
  PromptInputModelSelect,
  PromptInputModelSelectContent,
  PromptInputModelSelectItem,
  PromptInputModelSelectTrigger,
  PromptInputModelSelectValue,
  PromptInputSubmit,
  PromptInputTextarea,
  PromptInputFooter,
  PromptInputTools,
} from "@/components/ai-elements/prompt-input";
import { Action, Actions } from "@/components/ai-elements/actions";
import { Fragment, useState } from "react";
import { useChat } from "@ai-sdk/react";
import { Response } from "@/components/ai-elements/response";
import { CopyIcon, GlobeIcon, RefreshCcwIcon } from "lucide-react";
import {
  Source,
  Sources,
  SourcesContent,
  SourcesTrigger,
} from "@/components/ai-elements/sources";
import {
  Reasoning,
  ReasoningContent,
  ReasoningTrigger,
} from "@/components/ai-elements/reasoning";
import { Loader } from "@/components/ai-elements/loader";

const models = [
  {
    name: "GPT 4o",
    value: "openai/gpt-4o",
  },
  {
    name: "Deepseek R1",
    value: "deepseek/deepseek-r1",
  },
];

const ChatBotDemo = () => {
  const [input, setInput] = useState("");
  const [model, setModel] = useState(models[0].value);
  const [webSearch, setWebSearch] = useState(false);
  const { messages, sendMessage, status, regenerate } = useChat();

  const handleSubmit = (message: PromptInputMessage) => {
    const hasText = Boolean(message.text);
    const hasAttachments = Boolean(message.files?.length);

    if (!(hasText || hasAttachments)) {
      return;
    }

    sendMessage(
      {
        text: message.text || "Sent with attachments",
        files: message.files,
      },
      {
        body: {
          model: model,
          webSearch: webSearch,
        },
      }
    );
    setInput("");
  };

  return (
    <div className="relative mx-auto flex h-screen max-w-4xl flex-col p-6">
      <Conversation className="h-full">
        <ConversationContent>
          {messages.map((message) => (
            <div key={message.id}>
              {message.role === "assistant" &&
                message.parts.filter((part) => part.type === "source-url")
                  .length > 0 && (
                  <Sources>
                    <SourcesTrigger
                      count={
                        message.parts.filter(
                          (part) => part.type === "source-url"
                        ).length
                      }
                    />
                    {message.parts
                      .filter((part) => part.type === "source-url")
                      .map((part, i) => (
                        <SourcesContent key={`${message.id}-${i}`}>
                          <Source href={part.url} title={part.url} />
                        </SourcesContent>
                      ))}
                  </Sources>
                )}
              {message.parts.map((part, i) => {
                switch (part.type) {
                  case "text":
                    return (
                      <Fragment key={`${message.id}-${i}`}>
                        <Message from={message.role}>
                          <MessageContent>
                            <Response>{part.text}</Response>
                          </MessageContent>
                        </Message>
                        {message.role === "assistant" &&
                          i === messages.length - 1 && (
                            <Actions className="mt-2">
                              <Action onClick={() => regenerate()} label="Retry">
                                <RefreshCcwIcon className="size-3" />
                              </Action>
                              <Action
                                onClick={() =>
                                  navigator.clipboard.writeText(part.text)
                                }
                                label="Copy"
                              >
                                <CopyIcon className="size-3" />
                              </Action>
                            </Actions>
                          )}
                      </Fragment>
                    );
                  case "reasoning":
                    return (
                      <Reasoning
                        key={`${message.id}-${i}`}
                        className="w-full"
                        isStreaming={status === "streaming"}
                      >
                        <ReasoningTrigger />
                        <ReasoningContent>{part.text}</ReasoningContent>
                      </Reasoning>
                    );
                  default:
                    return null;
                }
              })}
            </div>
          ))}
          {status === "submitted" && <Loader />}
        </ConversationContent>
        <ConversationScrollButton />
      </Conversation>

      <PromptInput onSubmit={handleSubmit} className="mt-4" globalDrop multiple>
        <PromptInputHeader>
          <PromptInputAttachments>
            {(attachment) => <PromptInputAttachment data={attachment} />}
          </PromptInputAttachments>
        </PromptInputHeader>
        <PromptInputBody>
          <PromptInputTextarea
            onChange={(e) => setInput(e.target.value)}
            value={input}
          />
        </PromptInputBody>
        <PromptInputFooter>
          <PromptInputTools>
            <PromptInputActionMenu>
              <PromptInputActionMenuTrigger />
              <PromptInputActionMenuContent>
                <PromptInputActionAddAttachments />
              </PromptInputActionMenuContent>
            </PromptInputActionMenu>
            <PromptInputButton
              variant={webSearch ? "default" : "ghost"}
              onClick={() => setWebSearch(!webSearch)}
            >
              <GlobeIcon size={16} />
              <span>Search</span>
            </PromptInputButton>
            <PromptInputModelSelect
              onValueChange={(value) => {
                setModel(value);
              }}
              value={model}
            >
              <PromptInputModelSelectTrigger>
                <PromptInputModelSelectValue />
              </PromptInputModelSelectTrigger>
              <PromptInputModelSelectContent>
                {models.map((model) => (
                  <PromptInputModelSelectItem
                    key={model.value}
                    value={model.value}
                  >
                    {model.name}
                  </PromptInputModelSelectItem>
                ))}
              </PromptInputModelSelectContent>
            </PromptInputModelSelect>
          </PromptInputTools>
          <PromptInputSubmit disabled={!input} status={status} />
        </PromptInputFooter>
      </PromptInput>
    </div>
  );
};

export default ChatBotDemo;
```

### Server

Create a new route handler `app/api/chat/route.ts` and paste in the following code. We use `perplexity/sonar` for web search because the model returns search results by default. We also pass `sendSources` and `sendReasoning` to `toUIMessageStreamResponse` so that sources and reasoning are streamed back to the client as message parts. The handler also accepts file attachments from the client.

```ts title="app/api/chat/route.ts"
import { streamText, UIMessage, convertToModelMessages } from "ai";

// Allow streaming responses up to 30 seconds
export const maxDuration = 30;

export async function POST(req: Request) {
  const {
    messages,
    model,
    webSearch,
  }: {
    messages: UIMessage[];
    model: string;
    webSearch: boolean;
  } = await req.json();

  const result = streamText({
    model: webSearch ? "perplexity/sonar" : model,
    messages: convertToModelMessages(messages),
    system:
      "You are a helpful assistant that can answer questions and help with tasks",
  });

  // send sources and reasoning back to the client
  return result.toUIMessageStreamResponse({
    sendSources: true,
    sendReasoning: true,
  });
}
```

You now have a working chatbot app with file attachment support! The chatbot can handle both text and file inputs through the action menu. Feel free to explore other components like [`Tool`](/elements/components/tool) or [`Task`](/elements/components/task) to extend your app, or view the other examples.
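To try the chatbot locally, start the Next.js dev server and open http://localhost:3000 (the default port) in your browser:

```bash title="Terminal"
npm run dev
```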