Initial commit

Zhongwei Li
2025-11-30 08:23:53 +08:00
commit caa3746d36
21 changed files with 4452 additions and 0 deletions


@@ -0,0 +1,116 @@
# AI SDK UI - Official Documentation Links
Organized links to official AI SDK UI and React hooks documentation.
**Last Updated**: 2025-10-22
---
## AI SDK UI Documentation
### Core Hooks
- **AI SDK UI Overview:** https://ai-sdk.dev/docs/ai-sdk-ui/overview
- **useChat:** https://ai-sdk.dev/docs/ai-sdk-ui/chatbot
- **useCompletion:** https://ai-sdk.dev/docs/ai-sdk-ui/completion
- **useObject:** https://ai-sdk.dev/docs/ai-sdk-ui/object-generation
### Advanced Topics (Not Replicated in This Skill)
- **Generative UI (RSC):** https://ai-sdk.dev/docs/ai-sdk-rsc/overview
- **Stream Protocols:** https://ai-sdk.dev/docs/ai-sdk-ui/stream-protocols
- **Message Metadata:** https://ai-sdk.dev/docs/ai-sdk-ui/message-metadata
- **Custom Transports:** https://ai-sdk.dev/docs/ai-sdk-ui/transports
---
## Next.js Integration
- **Next.js App Router:** https://ai-sdk.dev/docs/getting-started/nextjs-app-router
- **Next.js Pages Router:** https://ai-sdk.dev/docs/getting-started/nextjs-pages-router
- **Next.js Documentation:** https://nextjs.org/docs
---
## Migration & Troubleshooting
- **v4 → v5 Migration Guide:** https://ai-sdk.dev/docs/migration-guides/migration-guide-5-0
- **Troubleshooting Guide:** https://ai-sdk.dev/docs/troubleshooting
- **Common Issues:** https://ai-sdk.dev/docs/troubleshooting/common-issues
- **All Error Types (28 total):** https://ai-sdk.dev/docs/reference/ai-sdk-errors
---
## API Reference
- **useChat API:** https://ai-sdk.dev/docs/reference/ai-sdk-ui/use-chat
- **useCompletion API:** https://ai-sdk.dev/docs/reference/ai-sdk-ui/use-completion
- **useObject API:** https://ai-sdk.dev/docs/reference/ai-sdk-ui/use-object
---
## Vercel Deployment
- **Vercel Functions:** https://vercel.com/docs/functions
- **Streaming on Vercel:** https://vercel.com/docs/functions/streaming
- **Environment Variables:** https://vercel.com/docs/projects/environment-variables
- **AI SDK 5.0 Release:** https://vercel.com/blog/ai-sdk-5
---
## GitHub & Community
- **GitHub Repository:** https://github.com/vercel/ai
- **GitHub Issues:** https://github.com/vercel/ai/issues
- **GitHub Discussions:** https://github.com/vercel/ai/discussions
- **Discord Community:** https://discord.gg/vercel
---
## TypeScript & React
- **TypeScript Handbook:** https://www.typescriptlang.org/docs/
- **React Documentation:** https://react.dev
---
## Complementary Skills
For complete AI SDK coverage, also see:
- **ai-sdk-core skill:** Backend text generation, structured output, tools, agents
- **cloudflare-workers-ai skill:** Native Cloudflare Workers AI binding (no multi-provider)
---
## Quick Navigation
### I want to...
**Build a chat interface:**
- Docs: https://ai-sdk.dev/docs/ai-sdk-ui/chatbot
- Template: `templates/use-chat-basic.tsx`
**Stream text completions:**
- Docs: https://ai-sdk.dev/docs/ai-sdk-ui/completion
- Template: `templates/use-completion-basic.tsx`
**Generate structured output:**
- Docs: https://ai-sdk.dev/docs/ai-sdk-ui/object-generation
- Template: `templates/use-object-streaming.tsx`
**Migrate from v4:**
- Docs: https://ai-sdk.dev/docs/migration-guides/migration-guide-5-0
- Reference: `references/use-chat-migration.md`
**Fix a UI error:**
- Reference: `references/top-ui-errors.md`
- Docs: https://ai-sdk.dev/docs/reference/ai-sdk-errors
**Deploy to production:**
- Reference: `references/nextjs-integration.md`
- Docs: https://vercel.com/docs/functions/streaming
---
**Last Updated**: 2025-10-22


@@ -0,0 +1,247 @@
# AI SDK UI - Next.js Integration
Complete guide for integrating AI SDK UI with Next.js.
**Last Updated**: 2025-10-22
---
## App Router (Next.js 13+)
### Directory Structure
```
app/
├── api/
│   └── chat/
│       └── route.ts        # API route
├── chat/
│   └── page.tsx            # Chat page (Client Component)
└── layout.tsx
```
### Chat Page (Client Component)
```tsx
// app/chat/page.tsx
'use client'; // REQUIRED
import { useChat } from 'ai/react';
import { useState, FormEvent } from 'react';
export default function ChatPage() {
const { messages, sendMessage, isLoading } = useChat({
api: '/api/chat',
});
const [input, setInput] = useState('');
const handleSubmit = (e: FormEvent) => {
e.preventDefault();
sendMessage({ content: input });
setInput('');
};
return (
<div>
{messages.map(m => <div key={m.id}>{m.content}</div>)}
<form onSubmit={handleSubmit}>
<input value={input} onChange={(e) => setInput(e.target.value)} />
</form>
</div>
);
}
```
### API Route
```typescript
// app/api/chat/route.ts
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';
export async function POST(req: Request) {
const { messages } = await req.json();
const result = streamText({
model: openai('gpt-4-turbo'),
messages,
});
return result.toDataStreamResponse(); // App Router method
}
```
---
## Pages Router (Next.js 12 and earlier)
### Directory Structure
```
pages/
├── api/
│   └── chat.ts             # API route
└── chat.tsx                # Chat page
```
### Chat Page
```tsx
// pages/chat.tsx
import { useChat } from 'ai/react';
import { useState, FormEvent } from 'react';
export default function ChatPage() {
const { messages, sendMessage, isLoading } = useChat({
api: '/api/chat',
});
const [input, setInput] = useState('');
const handleSubmit = (e: FormEvent) => {
e.preventDefault();
sendMessage({ content: input });
setInput('');
};
return (
<div>
{messages.map(m => <div key={m.id}>{m.content}</div>)}
<form onSubmit={handleSubmit}>
<input value={input} onChange={(e) => setInput(e.target.value)} />
</form>
</div>
);
}
```
### API Route
```typescript
// pages/api/chat.ts
import type { NextApiRequest, NextApiResponse } from 'next';
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';
export default async function handler(
req: NextApiRequest,
res: NextApiResponse
) {
const { messages } = req.body;
const result = streamText({
model: openai('gpt-4-turbo'),
messages,
});
return result.pipeDataStreamToResponse(res); // Pages Router method
}
```
---
## Key Differences
| Feature | App Router | Pages Router |
|---------|------------|--------------|
| Route Handler | `app/api/chat/route.ts` | `pages/api/chat.ts` |
| Stream Method | `toDataStreamResponse()` | `pipeDataStreamToResponse()` |
| Client Directive | Requires `'use client'` | Not required |
| Server Components | Supported | Not supported |
---
## Environment Variables
### .env.local
```bash
# Required
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GOOGLE_GENERATIVE_AI_API_KEY=...
# Optional
NODE_ENV=development
```
### Accessing in API Routes
```typescript
// App Router
export async function POST(req: Request) {
const apiKey = process.env.OPENAI_API_KEY;
// ...
}
// Pages Router
export default async function handler(req, res) {
const apiKey = process.env.OPENAI_API_KEY;
// ...
}
```
---
## Deployment to Vercel
### 1. Add Environment Variables
In Vercel Dashboard:
1. Go to Settings → Environment Variables
2. Add `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, etc.
3. Select environments (Production, Preview, Development)
### 2. Deploy
```bash
npm run build
vercel deploy
```
Vercel auto-detects streaming and configures appropriately.
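If generations run long, you can raise the function's maximum duration with a route segment config. A minimal sketch; the allowed ceiling depends on your Vercel plan:
```typescript
// app/api/chat/route.ts
// Allow streaming responses to run up to 30 seconds (adjust to your plan's limit)
export const maxDuration = 30;
```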
### 3. Verify Streaming
Check response headers:
- `Transfer-Encoding: chunked`
- `X-Vercel-Streaming: true`
**Docs**: https://vercel.com/docs/functions/streaming
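To verify from a script instead of the browser devtools, a minimal sketch (the deployment URL is a placeholder):
```typescript
// verify-streaming.ts — quick check that the deployed endpoint streams
async function main() {
  const res = await fetch('https://your-app.vercel.app/api/chat', { // placeholder URL
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ messages: [{ role: 'user', content: 'Hello' }] }),
  });

  console.log('status:', res.status);
  console.log('transfer-encoding:', res.headers.get('transfer-encoding')); // expect "chunked"
  console.log('content-type:', res.headers.get('content-type'));
}

main();
```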
---
## Common Issues
### Issue: "useChat is not defined"
**Cause**: Not importing from correct package.
**Fix**:
```tsx
import { useChat } from 'ai/react'; // ✅ Correct
import { useChat } from 'ai'; // ❌ Wrong
```
### Issue: "Cannot use 'use client' directive"
**Cause**: Using `'use client'` in Pages Router.
**Fix**: Remove `'use client'` - only needed in App Router.
### Issue: "API route returns 405 Method Not Allowed"
**Cause**: Using GET instead of POST.
**Fix**: Ensure API route exports `POST` function (App Router) or checks `req.method === 'POST'` (Pages Router).
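A minimal Pages Router method guard for this (the App Router avoids the issue by only exporting `POST`):
```typescript
// pages/api/chat.ts — reject anything other than POST
import type { NextApiRequest, NextApiResponse } from 'next';

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  if (req.method !== 'POST') {
    res.setHeader('Allow', 'POST');
    return res.status(405).end('Method Not Allowed');
  }
  // ...streaming logic as shown above
}
```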
---
## Official Documentation
- **App Router**: https://ai-sdk.dev/docs/getting-started/nextjs-app-router
- **Pages Router**: https://ai-sdk.dev/docs/getting-started/nextjs-pages-router
- **Next.js Docs**: https://nextjs.org/docs
---
**Last Updated**: 2025-10-22


@@ -0,0 +1,433 @@
# AI SDK UI - Streaming Best Practices
UI patterns and best practices for streaming AI responses.
**Last Updated**: 2025-10-22
---
## Performance
### Always Use Streaming for Long-Form Content
```tsx
// ✅ GOOD: Streaming provides better perceived performance
const { messages } = useChat({ api: '/api/chat' });
// ❌ BAD: Blocking - user waits for entire response
const response = await fetch('/api/chat', { method: 'POST' });
```
**Why?**
- Users see tokens as they arrive
- Perceived performance is much faster
- Users can start reading before response completes
- Can stop generation early
---
## UX Patterns
### 1. Show Loading States
```tsx
const { messages, isLoading } = useChat();
{isLoading && (
<div className="flex space-x-2">
<div className="w-2 h-2 bg-gray-500 rounded-full animate-bounce" />
<div className="w-2 h-2 bg-gray-500 rounded-full animate-bounce delay-100" />
<div className="w-2 h-2 bg-gray-500 rounded-full animate-bounce delay-200" />
</div>
)}
```
### 2. Provide Stop Button
```tsx
const { isLoading, stop } = useChat();
{isLoading && (
<button onClick={stop} className="bg-red-500 text-white px-4 py-2 rounded">
Stop Generation
</button>
)}
```
### 3. Auto-Scroll to Latest Message
```tsx
const messagesEndRef = useRef<HTMLDivElement>(null);
useEffect(() => {
messagesEndRef.current?.scrollIntoView({ behavior: 'smooth' });
}, [messages]);
<div ref={messagesEndRef} />
```
### 4. Disable Input While Loading
```tsx
<input
value={input}
onChange={(e) => setInput(e.target.value)}
disabled={isLoading} // Prevent new messages while generating
className="disabled:bg-gray-100"
/>
```
### 5. Handle Empty States
```tsx
{messages.length === 0 ? (
<div className="text-center">
<h2>Start a conversation</h2>
<p>Ask me anything!</p>
</div>
) : (
// Messages list
)}
```
---
## Error Handling
### 1. Display Errors to Users
```tsx
const { error } = useChat();
{error && (
<div className="p-4 bg-red-50 text-red-700 rounded">
<strong>Error:</strong> {error.message}
</div>
)}
```
### 2. Provide Retry Functionality
```tsx
const { error, reload } = useChat();
{error && (
<div className="flex items-center justify-between p-4 bg-red-50">
<span>{error.message}</span>
<button onClick={reload} className="px-3 py-1 border rounded">
Retry
</button>
</div>
)}
```
### 3. Handle Network Failures Gracefully
```tsx
useChat({
onError: (error) => {
console.error('Chat error:', error);
// Log to monitoring service (Sentry, etc.)
// Show user-friendly message
},
});
```
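A minimal sketch of the "show a user-friendly message" step; the `banner` state and the error-message mapping are illustrative, not part of the SDK:
```tsx
const [banner, setBanner] = useState<string | null>(null);

useChat({
  onError: (error) => {
    console.error('Chat error:', error);
    // Map low-level failures to copy the user can act on (mapping is illustrative)
    const friendly = /network|fetch|timeout/i.test(error.message)
      ? 'Connection problem - check your network and try again.'
      : 'Something went wrong - please retry.';
    setBanner(friendly);
  },
});

{banner && (
  <div className="p-4 bg-red-50 text-red-700 rounded">{banner}</div>
)}
```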
### 4. Log Errors for Debugging
```tsx
useChat({
onError: (error) => {
const errorLog = {
timestamp: new Date().toISOString(),
message: error.message,
url: window.location.href,
};
console.error('AI SDK Error:', errorLog);
// Send to Sentry/Datadog/etc.
},
});
```
---
## Message Rendering
### 1. Support Markdown
Use `react-markdown` for rich content:
```tsx
import ReactMarkdown from 'react-markdown';
{messages.map(m => (
<ReactMarkdown>{m.content}</ReactMarkdown>
))}
```
### 2. Handle Code Blocks
```tsx
import { Prism as SyntaxHighlighter } from 'react-syntax-highlighter';
<ReactMarkdown
components={{
code({ node, inline, className, children, ...props }) {
const match = /language-(\w+)/.exec(className || '');
return !inline && match ? (
<SyntaxHighlighter language={match[1]}>
{String(children)}
</SyntaxHighlighter>
) : (
<code className={className} {...props}>
{children}
</code>
);
},
}}
>
{message.content}
</ReactMarkdown>
```
### 3. Display Tool Calls Visually
```tsx
{message.toolInvocations?.map((tool, idx) => (
<div key={idx} className="bg-blue-50 border border-blue-200 p-3 rounded">
<div className="font-semibold">Tool: {tool.toolName}</div>
<div className="text-sm">Args: {JSON.stringify(tool.args)}</div>
{tool.result && (
<div className="text-sm">Result: {JSON.stringify(tool.result)}</div>
)}
</div>
))}
```
### 4. Show Timestamps
```tsx
<div className="text-xs text-gray-500">
{new Date(message.createdAt).toLocaleTimeString()}
</div>
```
### 5. Group Messages by Role
```tsx
{messages.reduce((groups, message, idx) => {
const prevMessage = messages[idx - 1];
const showRole = !prevMessage || prevMessage.role !== message.role;
return [
...groups,
<div key={message.id}>
{showRole && <div className="font-bold">{message.role}</div>}
<div>{message.content}</div>
</div>
];
}, [])}
```
---
## State Management
### 1. Persist Chat History
```tsx
const chatId = 'chat-123';
const { messages } = useChat({
id: chatId,
initialMessages: loadFromLocalStorage(chatId),
});
useEffect(() => {
saveToLocalStorage(chatId, messages);
}, [messages, chatId]);
```
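The `loadFromLocalStorage` / `saveToLocalStorage` helpers above are not part of the SDK; a minimal sketch:
```tsx
function loadFromLocalStorage(chatId: string) {
  try {
    const raw = localStorage.getItem(`chat:${chatId}`);
    return raw ? JSON.parse(raw) : [];
  } catch {
    return []; // corrupted or unavailable storage
  }
}

function saveToLocalStorage(chatId: string, messages: unknown[]) {
  localStorage.setItem(`chat:${chatId}`, JSON.stringify(messages));
}
```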
### 2. Clear Chat Functionality
```tsx
const { setMessages } = useChat();
const clearChat = () => {
if (confirm('Clear chat history?')) {
setMessages([]);
}
};
```
### 3. Export/Import Conversations
```tsx
const exportChat = () => {
const json = JSON.stringify(messages, null, 2);
const blob = new Blob([json], { type: 'application/json' });
const url = URL.createObjectURL(blob);
const a = document.createElement('a');
a.href = url;
a.download = `chat-${Date.now()}.json`;
a.click();
};
const importChat = (file: File) => {
const reader = new FileReader();
reader.onload = (e) => {
const imported = JSON.parse(e.target?.result as string);
setMessages(imported);
};
reader.readAsText(file);
};
```
### 4. Handle Multiple Chats (Routing)
```tsx
// Use URL params for chat ID
const searchParams = useSearchParams();
const chatId = searchParams.get('chatId') || 'default';
const { messages } = useChat({
id: chatId,
initialMessages: loadMessages(chatId),
});
// Navigation
<Link href={`/chat?chatId=${newChatId}`}>New Chat</Link>
```
---
## Advanced Patterns
### 1. Debounced Input for Completions
```tsx
import { useDebouncedCallback } from 'use-debounce';
const { complete } = useCompletion();
const debouncedComplete = useDebouncedCallback((value) => {
complete(value);
}, 500);
<input onChange={(e) => debouncedComplete(e.target.value)} />
```
### 2. Optimistic Updates
```tsx
const { messages, setMessages, sendMessage } = useChat();
const optimisticSend = (content: string) => {
// Add user message immediately
const tempMessage = {
id: `temp-${Date.now()}`,
role: 'user',
content,
};
setMessages([...messages, tempMessage]);
// Send to server
sendMessage({ content });
};
```
### 3. Custom Message Formatting
```tsx
const formatMessage = (content: string) => {
// Replace @mentions
content = content.replace(/@(\w+)/g, '<span class="mention">@$1</span>');
// Replace URLs
content = content.replace(
/(https?:\/\/[^\s]+)/g,
'<a href="$1" target="_blank">$1</a>'
);
return content;
};
<div dangerouslySetInnerHTML={{ __html: formatMessage(message.content) }} />
```
### 4. Typing Indicators
```tsx
const [isTyping, setIsTyping] = useState(false);
const { sendMessage } = useChat({
  onFinish: () => setIsTyping(false),
});
const handleSend = (content: string) => {
setIsTyping(true);
sendMessage({ content });
};
{isTyping && <div className="text-gray-500 italic">AI is typing...</div>}
```
---
## Performance Optimization
### 1. Virtualize Long Message Lists
```tsx
import { FixedSizeList } from 'react-window';
<FixedSizeList
height={600}
itemCount={messages.length}
itemSize={100}
width="100%"
>
{({ index, style }) => (
<div style={style}>
{messages[index].content}
</div>
)}
</FixedSizeList>
```
### 2. Lazy Load Message History
```tsx
const [page, setPage] = useState(1);
const messagesPerPage = 50;
const visibleMessages = messages.slice(
(page - 1) * messagesPerPage,
page * messagesPerPage
);
```
### 3. Memoize Message Rendering
```tsx
import { memo } from 'react';
const MessageComponent = memo(({ message }: { message: Message }) => {
return <div>{message.content}</div>;
});
{messages.map(m => <MessageComponent key={m.id} message={m} />)}
```
---
## Official Documentation
- **AI SDK UI Overview**: https://ai-sdk.dev/docs/ai-sdk-ui/overview
- **Streaming Protocols**: https://ai-sdk.dev/docs/ai-sdk-ui/stream-protocols
- **Message Metadata**: https://ai-sdk.dev/docs/ai-sdk-ui/message-metadata
---
**Last Updated**: 2025-10-22

references/top-ui-errors.md

@@ -0,0 +1,303 @@
# AI SDK UI - Top 12 Errors & Solutions
Common AI SDK UI errors with actionable solutions.
**Last Updated**: 2025-10-22
---
## 1. useChat Failed to Parse Stream
**Error**: `SyntaxError: Unexpected token in JSON at position X`
**Cause**: API route not returning proper stream format.
**Solution**:
```typescript
// ✅ CORRECT (App Router)
export async function POST(req: Request) {
const result = streamText({ /* ... */ });
return result.toDataStreamResponse(); // Correct method
}
// ✅ CORRECT (Pages Router)
export default async function handler(req, res) {
const result = streamText({ /* ... */ });
return result.pipeDataStreamToResponse(res); // Correct method
}
// ❌ WRONG
return new Response(result.textStream); // Missing stream protocol
```
---
## 2. useChat No Response
**Cause**: API route not streaming correctly or wrong method.
**Solution**:
```typescript
// Check 1: Are you using the right method?
// App Router: toDataStreamResponse()
// Pages Router: pipeDataStreamToResponse()
// Check 2: Is your API route returning a Response?
export async function POST(req: Request) {
const result = streamText({ model: openai('gpt-4'), messages });
return result.toDataStreamResponse(); // Must return this!
}
// Check 3: Check network tab - is the request completing?
// If status is 200 but no data: likely streaming issue
```
---
## 3. Unclosed Streams
**Cause**: Stream not properly closed in API.
**Solution**:
```typescript
// ✅ GOOD: SDK handles closing automatically
export async function POST(req: Request) {
const result = streamText({ model: openai('gpt-4'), messages });
return result.toDataStreamResponse();
}
// ❌ BAD: Manual stream handling (error-prone)
const encoder = new TextEncoder();
const stream = new ReadableStream({
async start(controller) {
// ...must manually close!
controller.close();
}
});
```
**GitHub Issue**: #4123
---
## 4. Streaming Not Working When Deployed
**Cause**: Deployment platform buffering responses.
**Solution**:
- **Vercel**: Auto-detects streaming (no config needed)
- **Netlify**: Ensure Edge Functions enabled
- **Cloudflare Workers**: Use `toDataStreamResponse()`
- **Other platforms**: Check for response buffering settings
```typescript
// Vercel - works out of the box
export async function POST(req: Request) {
const result = streamText({ /* ... */ });
return result.toDataStreamResponse();
}
```
**Docs**: https://vercel.com/docs/functions/streaming
---
## 5. Streaming Not Working When Proxied
**Cause**: Proxy (nginx, Cloudflare, etc.) buffering responses.
**Solution**:
**Nginx**:
```nginx
location /api/ {
proxy_pass http://localhost:3000;
proxy_buffering off; # Disable buffering
proxy_cache off;
}
```
**Cloudflare**: Disable "Auto Minify" in dashboard
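If you cannot change the proxy configuration, you can also hint per response that buffering should be disabled. `X-Accel-Buffering: no` is honored by nginx and some CDNs, though support varies by platform; a sketch:
```typescript
// app/api/chat/route.ts — ask intermediaries not to buffer the stream
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';

export async function POST(req: Request) {
  const { messages } = await req.json();
  const result = streamText({ model: openai('gpt-4-turbo'), messages });

  const response = result.toDataStreamResponse();
  response.headers.set('X-Accel-Buffering', 'no');                 // nginx: disable proxy buffering
  response.headers.set('Cache-Control', 'no-cache, no-transform'); // discourage CDN transforms
  return response;
}
```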
---
## 6. Strange Stream Output (0:... characters)
**Error**: Seeing raw stream protocol like `0:"Hello"` in browser.
**Cause**: Not using correct hook or consuming stream directly.
**Solution**:
```tsx
// ✅ CORRECT: Use useChat hook
const { messages } = useChat({ api: '/api/chat' });
// ❌ WRONG: Consuming stream directly
const response = await fetch('/api/chat');
const reader = response.body.getReader(); // Don't do this!
```
---
## 7. Stale Body Values with useChat
**Cause**: `body` captured at first render only.
**Solution**:
```tsx
// ❌ BAD: body captured once
const { userId } = useUser();
const { messages } = useChat({
body: { userId }, // Stale! Won't update if userId changes
});
// ✅ GOOD: Use data in sendMessage
const { userId } = useUser();
const { messages, sendMessage } = useChat();
sendMessage({
content: input,
data: { userId }, // Fresh value on each send
});
```
---
## 8. Custom Headers Not Working with useChat
**Cause**: Headers not passed correctly.
**Solution**:
```tsx
// ✅ CORRECT
const { messages } = useChat({
headers: {
'Authorization': `Bearer ${token}`,
'X-Custom-Header': 'value',
},
});
// OR use fetch options
const { messages } = useChat({
fetch: (url, options) => {
return fetch(url, {
...options,
headers: {
...options.headers,
'Authorization': `Bearer ${token}`,
},
});
},
});
```
---
## 9. React Maximum Update Depth
**Error**: `Maximum update depth exceeded`
**Cause**: Infinite loop in useEffect.
**Solution**:
```tsx
// ❌ BAD: Infinite loop
const saveMessages = (messages) => { /* ... */ };
useEffect(() => {
saveMessages(messages);
}, [messages, saveMessages]); // saveMessages changes every render!
// ✅ GOOD: Only depend on messages
useEffect(() => {
localStorage.setItem('messages', JSON.stringify(messages));
}, [messages]); // saveMessages not needed in deps
```
---
## 10. Repeated Assistant Messages
**Cause**: Duplicate message handling or multiple sendMessage calls.
**Solution**:
```tsx
// ❌ BAD: Calling sendMessage multiple times
const handleSubmit = (e) => {
e.preventDefault();
sendMessage({ content: input });
sendMessage({ content: input }); // Duplicate!
};
// ✅ GOOD: Single call
const handleSubmit = (e) => {
e.preventDefault();
if (!input.trim()) return; // Guard
sendMessage({ content: input });
setInput('');
};
```
---
## 11. onFinish Not Called When Stream Aborted
**Cause**: Stream abort doesn't trigger onFinish callback.
**Solution**:
```tsx
const { stop } = useChat({
onFinish: (message) => {
console.log('Finished:', message);
},
});
// Handle abort separately
const handleStop = () => {
stop();
console.log('Stream aborted by user');
// Do cleanup here
};
```
---
## 12. Type Error with Message Parts (v5)
**Error**: `Property 'parts' does not exist on type 'Message'`
**Cause**: v5 changed message structure for tool calls.
**Solution**:
```tsx
// ✅ CORRECT (v5)
messages.map(message => {
// Use content for simple messages
if (message.content) {
return <div>{message.content}</div>;
}
// Use toolInvocations for tool calls
if (message.toolInvocations) {
return message.toolInvocations.map(tool => (
<div key={tool.toolCallId}>
Tool: {tool.toolName}
</div>
));
}
});
// ❌ WRONG (v4 style)
message.toolCalls // Doesn't exist in v5
```
---
## For More Errors
See complete error reference (28 total types):
https://ai-sdk.dev/docs/reference/ai-sdk-errors
---
**Last Updated**: 2025-10-22


@@ -0,0 +1,432 @@
# useChat v4 → v5 Migration Guide
Complete guide to migrating from AI SDK v4 to v5 for UI hooks.
**Last Updated**: 2025-10-22
**Applies to**: AI SDK v5.0+
---
## Critical Breaking Change
**BREAKING: useChat no longer manages input state!**
In v4, `useChat` provided `input`, `handleInputChange`, and `handleSubmit`. In v5, you must manage input state manually using `useState`.
---
## Quick Migration Checklist
- [ ] Replace `input`, `handleInputChange`, `handleSubmit` with manual state
- [ ] Change `append()` to `sendMessage()`
- [ ] Replace `onResponse` with `onFinish`
- [ ] Move `initialMessages` to controlled mode with `messages` prop
- [ ] Remove `maxSteps` (handle server-side)
- [ ] Update message rendering for parts structure (if using tools)
---
## 1. Input State Management (CRITICAL)
### v4 (OLD)
```tsx
import { useChat } from 'ai/react';
export default function Chat() {
const { messages, input, handleInputChange, handleSubmit } = useChat({
api: '/api/chat',
});
return (
<div>
{messages.map(m => <div key={m.id}>{m.content}</div>)}
<form onSubmit={handleSubmit}>
<input value={input} onChange={handleInputChange} />
</form>
</div>
);
}
```
### v5 (NEW)
```tsx
import { useChat } from 'ai/react';
import { useState, FormEvent } from 'react';
export default function Chat() {
const { messages, sendMessage } = useChat({
api: '/api/chat',
});
// Manual input state
const [input, setInput] = useState('');
const handleSubmit = (e: FormEvent) => {
e.preventDefault();
sendMessage({ content: input });
setInput('');
};
return (
<div>
{messages.map(m => <div key={m.id}>{m.content}</div>)}
<form onSubmit={handleSubmit}>
<input
value={input}
onChange={(e) => setInput(e.target.value)}
/>
</form>
</div>
);
}
```
**Why?**
- More control over input handling
- Easier to add features like debouncing, validation, etc.
- Consistent with React patterns
---
## 2. append() → sendMessage()
### v4 (OLD)
```tsx
const { append } = useChat();
// Append a message
append({
role: 'user',
content: 'Hello',
});
```
### v5 (NEW)
```tsx
const { sendMessage } = useChat();
// Send a message (role is assumed to be 'user')
sendMessage({
content: 'Hello',
});
// With attachments
sendMessage({
content: 'Analyze this image',
experimental_attachments: [
{ name: 'image.png', contentType: 'image/png', url: 'blob:...' },
],
});
```
**Why?**
- Clearer API: `sendMessage` is more intuitive than `append`
- Supports attachments natively
- Role is always 'user' (no need to specify)
---
## 3. onResponse → onFinish
### v4 (OLD)
```tsx
const { messages } = useChat({
onResponse: (response) => {
console.log('Response received:', response);
},
});
```
### v5 (NEW)
```tsx
const { messages } = useChat({
onFinish: (message, options) => {
console.log('Response finished:', message);
console.log('Finish reason:', options.finishReason);
console.log('Usage:', options.usage);
},
});
```
**Why?**
- `onResponse` fired too early (when response started)
- `onFinish` fires when response is complete
- Provides more context (usage, finish reason)
---
## 4. initialMessages → Controlled Mode
### v4 (OLD)
```tsx
const { messages } = useChat({
initialMessages: [
{ role: 'system', content: 'You are a helpful assistant.' },
],
});
```
### v5 (NEW - Option 1: Uncontrolled)
```tsx
const { messages } = useChat({
// Use initialMessages for read-only initialization
initialMessages: [
{ role: 'system', content: 'You are a helpful assistant.' },
],
});
```
### v5 (NEW - Option 2: Controlled)
```tsx
const [messages, setMessages] = useState([
{ role: 'system', content: 'You are a helpful assistant.' },
]);
const { sendMessage } = useChat({
messages, // Pass messages for controlled mode
onUpdate: ({ messages }) => {
setMessages(messages); // Sync state
},
});
```
**Why?**
- Clearer distinction between controlled and uncontrolled
- Easier to persist messages to a database (see the sketch below)
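Building on Option 2 above, a minimal persistence sketch; the `chatId` value and the `/api/chats/:id` endpoint are hypothetical, and a real app would likely debounce the save:
```tsx
const [messages, setMessages] = useState([
  { id: 'sys-1', role: 'system', content: 'You are a helpful assistant.' },
]);

const { sendMessage } = useChat({
  messages, // controlled mode, as in Option 2
  onUpdate: ({ messages: next }) => {
    setMessages(next);                  // keep React state in sync
    void persistMessages(chatId, next); // fire-and-forget save (hypothetical endpoint)
  },
});

async function persistMessages(id: string, msgs: unknown[]) {
  await fetch(`/api/chats/${id}`, {
    method: 'PUT',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ messages: msgs }),
  });
}
```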
---
## 5. maxSteps Removed
### v4 (OLD)
```tsx
const { messages } = useChat({
maxSteps: 5, // Limit agent steps
});
```
### v5 (NEW)
Handle `maxSteps` (or `stopWhen`) on the **server-side** only:
```typescript
// app/api/chat/route.ts
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';
export async function POST(req: Request) {
const { messages } = await req.json();
const result = streamText({
model: openai('gpt-4'),
messages,
maxSteps: 5, // Handle on server
});
return result.toDataStreamResponse();
}
```
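As noted above, newer v5 releases express this limit with `stopWhen` instead of `maxSteps`. A hedged sketch, assuming your installed version exports `stepCountIs`:
```typescript
import { streamText, stepCountIs } from 'ai';
import { openai } from '@ai-sdk/openai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai('gpt-4'),
    messages,
    stopWhen: stepCountIs(5), // stop after at most 5 steps (tool calls + responses)
  });

  return result.toDataStreamResponse();
}
```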
**Why?**
- Server has more control over costs
- Prevents client-side bypass
- Consistent with v5 architecture
---
## 6. Message Structure (for Tools)
### v4 (OLD)
```tsx
// Simple message structure
{
id: '1',
role: 'assistant',
content: 'The weather is sunny',
toolCalls: [...] // Tool calls as separate property
}
```
### v5 (NEW)
```tsx
// Parts-based structure
{
id: '1',
role: 'assistant',
content: 'The weather is sunny', // Still exists for simple messages
parts: [
{ type: 'text', content: 'The weather is' },
{ type: 'tool-call', toolName: 'getWeather', args: { location: 'SF' } },
{ type: 'tool-result', toolName: 'getWeather', result: { temp: 72 } },
{ type: 'text', content: 'sunny' },
]
}
```
**Rendering v5 Messages:**
```tsx
messages.map(message => {
// For simple text messages, use content
if (message.content) {
return <div>{message.content}</div>;
}
// For tool calls, use toolInvocations
if (message.toolInvocations) {
return message.toolInvocations.map(tool => (
<div key={tool.toolCallId}>
Tool: {tool.toolName}
Args: {JSON.stringify(tool.args)}
Result: {JSON.stringify(tool.result)}
</div>
));
}
});
```
---
## 7. Other Removed/Changed Properties
### Removed in v5
- `input` - Use manual `useState`
- `handleInputChange` - Use `onChange={(e) => setInput(e.target.value)}`
- `handleSubmit` - Use custom submit handler
- `onResponse` - Use `onFinish` instead
### Renamed in v5
- `append()` → `sendMessage()`
- `initialMessages` → Still exists, but use `messages` prop for controlled mode
### Added in v5
- `sendMessage()` - New way to send messages
- `experimental_attachments` - File attachments support
- `toolInvocations` - Simplified tool call rendering
---
## Common Migration Patterns
### Pattern 1: Basic Chat
**v4:**
```tsx
const { messages, input, handleInputChange, handleSubmit } = useChat();
<form onSubmit={handleSubmit}>
<input value={input} onChange={handleInputChange} />
</form>
```
**v5:**
```tsx
const { messages, sendMessage } = useChat();
const [input, setInput] = useState('');
<form onSubmit={(e) => {
e.preventDefault();
sendMessage({ content: input });
setInput('');
}}>
<input value={input} onChange={(e) => setInput(e.target.value)} />
</form>
```
### Pattern 2: With Initial Messages
**v4:**
```tsx
const { messages } = useChat({
initialMessages: loadFromStorage(),
});
```
**v5:**
```tsx
const { messages } = useChat({
initialMessages: loadFromStorage(), // Still works
});
```
### Pattern 3: With Response Callback
**v4:**
```tsx
useChat({
onResponse: (res) => console.log('Started'),
});
```
**v5:**
```tsx
useChat({
onFinish: (msg, opts) => {
console.log('Finished');
console.log('Tokens:', opts.usage.totalTokens);
},
});
```
---
## Migration Troubleshooting
### Error: "input is undefined"
**Cause**: You're using v5 but trying to access `input` from `useChat`.
**Fix**: Add manual input state:
```tsx
const [input, setInput] = useState('');
```
### Error: "append is not a function"
**Cause**: `append()` was renamed to `sendMessage()` in v5.
**Fix**: Replace all instances of `append()` with `sendMessage()`.
### Error: "handleSubmit is undefined"
**Cause**: v5 doesn't provide `handleSubmit`.
**Fix**: Create custom submit handler:
```tsx
const handleSubmit = (e: FormEvent) => {
e.preventDefault();
sendMessage({ content: input });
setInput('');
};
```
### Warning: "onResponse is deprecated"
**Cause**: v5 removed `onResponse`.
**Fix**: Use `onFinish` instead.
---
## Official Migration Resources
- **v5 Migration Guide**: https://ai-sdk.dev/docs/migration-guides/migration-guide-5-0
- **useChat API Reference**: https://ai-sdk.dev/docs/reference/ai-sdk-ui/use-chat
- **v5 Release Notes**: https://vercel.com/blog/ai-sdk-5
---
**Last Updated**: 2025-10-22