8 changes: 8 additions & 0 deletions content/docs/02-getting-started/07-expo.mdx
@@ -172,6 +172,14 @@ Pick the approach that best matches how you want to manage providers across your

Now that you have an API route that can query an LLM, it's time to setup your frontend. The AI SDK's [ UI ](/docs/ai-sdk-ui) package abstracts the complexity of a chat interface into one hook, [`useChat`](/docs/reference/ai-sdk-ui/use-chat).

<Note>
If you need high-performance rich rendering on-device (e.g. markdown/MDX in
React Native), you can stream a pre-parsed JSON “document tree” and
incremental patches alongside the model output using [`data-*` UI
parts](/docs/ai-sdk-ui/streaming-data#streaming-a-json-document-tree--patches-great-for-react-native).
This keeps the `useChat` DX while doing less work on the client.
</Note>

Update your root page (`app/(tabs)/index.tsx`) with the following code to show a list of chat messages and provide a user message input:

```tsx filename="app/(tabs)/index.tsx"
121 changes: 121 additions & 0 deletions content/docs/04-ai-sdk-ui/20-streaming-data.mdx
@@ -192,6 +192,127 @@ When you write to a data part with the same ID, the client automatically reconci

The reconciliation happens automatically: simply use the same `id` when writing to the stream.
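Conceptually, this reconciliation is a keyed upsert over a message's data parts. The following self-contained sketch illustrates the idea only; it is not the SDK's actual implementation, and the `reconcile` helper and matching rule are assumptions for illustration:

```typescript
// Hypothetical simplified model of same-id reconciliation (illustration only).
type DataPart = { type: string; id?: string; data: unknown };

function reconcile(parts: DataPart[], incoming: DataPart): DataPart[] {
  if (incoming.id != null) {
    const index = parts.findIndex(
      p => p.type === incoming.type && p.id === incoming.id,
    );
    if (index !== -1) {
      // Same type + id: replace the existing part in place.
      const next = parts.slice();
      next[index] = incoming;
      return next;
    }
  }
  // No id match: append as a new part.
  return [...parts, incoming];
}

const a = reconcile([], { type: 'data-weather', id: 'nyc', data: 'loading' });
const b = reconcile(a, { type: 'data-weather', id: 'nyc', data: 'sunny' });
// b contains a single part whose data is now 'sunny'.
```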

## Streaming a JSON “document tree” + patches (great for React Native)

If markdown/MDX parsing is expensive on the client (common in React Native), you can do **less work on-device** by streaming a **pre-parsed JSON tree** (AST) from the server and then streaming **patches** as the model response streams.

This pattern keeps the `useChat` DX (tools, status, stop/regenerate, persistence) but lets the UI render from structured data instead of parsing markdown strings.

### Recommended shape

- **`data-mdxTree` (persistent)**: occasional full snapshots of the current tree (use `id` so it updates in place)
- **`data-mdxPatch` (transient)**: frequent incremental patches (set `transient: true` so message history does not grow)

You can use any patch protocol you want (JSON Patch, jsondiffpatch, or a custom protocol); the SDK delivers these `data-*` chunks to your `onData` callback.
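For illustration, here is a self-contained sketch of a custom patch protocol with a single `append-text` op addressed by a JSON-pointer-style path. The tree shape mirrors the examples below, but the `applyPatch` walk is an assumption for illustration, not the SDK's shipped helper:

```typescript
// Minimal element/text tree, as streamed in the examples below.
type TreeNode =
  | { type: 'element'; name: string; children: TreeNode[] }
  | { type: 'text'; value: string };

// Hypothetical single-op patch protocol (illustration only).
type Patch = { op: 'append-text'; path: string; text: string };

function applyPatch(root: TreeNode, patch: Patch): TreeNode {
  // '/children/0/children/0/value' -> ['children', '0', 'children', '0', 'value']
  const segments = patch.path.split('/').filter(Boolean);

  // Walk immutably, following 'children/<index>' pairs down to the text leaf.
  function walk(node: TreeNode, i: number): TreeNode {
    if (segments[i] === 'value' && node.type === 'text') {
      return { ...node, value: node.value + patch.text };
    }
    if (node.type === 'element' && segments[i] === 'children') {
      const index = Number(segments[i + 1]);
      const children = node.children.map((child, j) =>
        j === index ? walk(child, i + 2) : child,
      );
      return { ...node, children };
    }
    return node; // path does not match this node: leave it untouched
  }

  return walk(root, 0);
}

const doc: TreeNode = {
  type: 'element',
  name: 'root',
  children: [
    { type: 'element', name: 'p', children: [{ type: 'text', value: 'Hel' }] },
  ],
};

const next = applyPatch(doc, {
  op: 'append-text',
  path: '/children/0/children/0/value',
  text: 'lo',
});
// The targeted text node now reads 'Hello'; `doc` itself is unchanged.
```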

To make this easier, the `ai` package ships a minimal, JSON-serializable tree format and patch helpers:

- `MdxTree` / `MdxTreeNode`
- `MdxTreePatch`
- `applyMdxTreePatch()` / `applyMdxTreePatches()`

And `@ai-sdk/react` provides a renderer:

- `renderMdxTree()`

If you parse MDX with remark/unified on the server, you can also use:

- `convertRemarkMdxToMdxTree()`
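If you prefer to roll your own conversion, the mapping from an mdast-style node to an element/text tree is mechanical. A simplified, self-contained sketch, assuming a handful of node kinds and a hypothetical `NAME_MAP` (the shipped `convertRemarkMdxToMdxTree()` covers far more):

```typescript
// Simplified mdast-like input (what remark-parse produces, heavily reduced).
type MdastNode =
  | { type: 'root' | 'paragraph' | 'strong'; children: MdastNode[] }
  | { type: 'text'; value: string };

// The element/text tree shape streamed to the client.
type TreeNode =
  | { type: 'element'; name: string; children: TreeNode[] }
  | { type: 'text'; value: string };

// Assumed mapping from mdast node types to element names.
const NAME_MAP: Record<string, string> = {
  root: 'root',
  paragraph: 'p',
  strong: 'strong',
};

function toTree(node: MdastNode): TreeNode {
  if (node.type === 'text') {
    return { type: 'text', value: node.value };
  }
  return {
    type: 'element',
    name: NAME_MAP[node.type] ?? node.type,
    children: node.children.map(toTree),
  };
}

const tree = toTree({
  type: 'root',
  children: [
    { type: 'paragraph', children: [{ type: 'text', value: 'hi' }] },
  ],
});
// tree is an element named 'root' with a 'p' element containing a text node.
```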

### Server example

```ts filename="app/api/chat/route.ts"
import { streamText, convertToModelMessages } from 'ai';
import { createUIMessageStream, createUIMessageStreamResponse } from 'ai';
import { openai } from '@ai-sdk/openai';

type MdxTree = unknown; // your AST type (or use `MdxTree` from 'ai')
type Patch = unknown; // your patch protocol (or use `MdxTreePatch` from 'ai')

export async function POST(req: Request) {
const { messages } = await req.json();
const modelMessages = await convertToModelMessages(messages);

const stream = createUIMessageStream({
execute: ({ writer }) => {
// 1) Initial tree snapshot (stored on the message)
writer.write({
type: 'data-mdxTree',
id: 'root',
data: { tree: /* initial AST */ null as unknown as MdxTree },
});

const result = streamText({
model: openai('gpt-4o'),
messages: modelMessages,
onChunk: ({ chunk }) => {
if (chunk.type !== 'text-delta' || chunk.text.length === 0) return;

// 2) Send a patch (transient: doesn't bloat message history)
writer.write({
type: 'data-mdxPatch',
data: {
patch: /* derived from chunk.text */ null as unknown as Patch,
},
transient: true,
});
},
onFinish: () => {
// 3) Optional final snapshot for recovery/persistence
writer.write({
type: 'data-mdxTree',
id: 'root',
data: { tree: /* final AST */ null as unknown as MdxTree },
});
},
});

writer.merge(result.toUIMessageStream());
},
});

return createUIMessageStreamResponse({ stream });
}
```

### React Native / Expo client example

In React Native you typically keep the “tree cache” in your own state/store and render from it. The key integration point is `onData`, and if you use Expo you can provide `expo/fetch` to the transport:

```tsx filename="app/(tabs)/index.tsx"
import { useChat } from '@ai-sdk/react';
import { DefaultChatTransport } from 'ai';
import { fetch as expoFetch } from 'expo/fetch';
import { useState } from 'react';

type MdxTree = unknown;
type Patch = unknown;

export default function ChatScreen() {
const [tree, setTree] = useState<MdxTree | null>(null);

useChat({
transport: new DefaultChatTransport({
api: 'https://your-server.example.com/api/chat',
fetch: expoFetch as unknown as typeof globalThis.fetch,
}),
onData: part => {
if (part.type === 'data-mdxTree') {
setTree(part.data.tree);
}

if (part.type === 'data-mdxPatch') {
// Apply patch to your local cache:
// setTree(prev => applyPatch(prev, part.data.patch))
}
},
});

// Render from `tree` (no markdown parsing required)
return null;
}
```
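A renderer over this tree is just a recursive walk that maps element names to components. As a self-contained illustration, here is a walk that extracts plain text; a real React Native renderer would emit `<Text>`/`<View>` elements keyed on `name` instead (or use the shipped `renderMdxTree()`):

```typescript
type TreeNode =
  | { type: 'element'; name: string; children: TreeNode[] }
  | { type: 'text'; value: string };

// Depth-first text extraction. A component renderer would follow the same
// recursion but return elements, e.g. 'p' -> <Text>, 'strong' -> bold <Text>.
function treeToText(node: TreeNode): string {
  if (node.type === 'text') return node.value;
  const inner = node.children.map(treeToText).join('');
  // Assumed convention for this sketch: paragraphs end with a newline.
  return node.name === 'p' ? inner + '\n' : inner;
}

const rendered = treeToText({
  type: 'element',
  name: 'root',
  children: [
    { type: 'element', name: 'p', children: [{ type: 'text', value: 'Hello' }] },
  ],
});
// rendered === 'Hello\n'
```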

## Processing Data on the Client

### Using the onData Callback
7 changes: 7 additions & 0 deletions examples/next-openai/README.md
@@ -41,3 +41,10 @@ To learn more about OpenAI, Next.js, and the AI SDK take a look at the following
- [Vercel AI Playground](https://ai-sdk.dev/playground)
- [OpenAI Documentation](https://platform.openai.com/docs) - learn about OpenAI features and API.
- [Next.js Documentation](https://nextjs.org/docs) - learn about Next.js features and API.

## Included demos

This example app includes many routes under `app/` and `app/api/`. Two useful patterns related to streaming custom UI data:

- **Streaming data UI parts**: `app/use-chat-data-ui-parts` and `app/api/use-chat-data-ui-parts`
- **Streaming a JSON “document tree” + patches** (markdown/MDX-friendly): `app/use-chat-mdx-tree-patches` and `app/api/use-chat-mdx-tree-patches`
118 changes: 118 additions & 0 deletions examples/next-openai/app/api/use-chat-mdx-tree-patches/route.ts
@@ -0,0 +1,118 @@
import { openai } from '@ai-sdk/openai';
import {
convertToModelMessages,
createUIMessageStream,
createUIMessageStreamResponse,
type MdxTree,
type MdxTreePatch,
stepCountIs,
streamText,
} from 'ai';
import type { UIMessageChunk } from 'ai';

type MdxTreeData = { rootId: string; tree: MdxTree };
type MdxPatchData = { rootId: string; patch: MdxTreePatch };

const ROOT_ID = 'mdx-root';
const TEXT_VALUE_PATH = '/children/0/children/0/value';

function createInitialTree(): MdxTree {
return {
type: 'element',
name: 'root',
children: [
{
type: 'element',
name: 'p',
children: [{ type: 'text', value: '' }],
},
],
};
}

export async function POST(req: Request) {
const { messages } = (await req.json()) as { messages: unknown[] };
const modelMessages = await convertToModelMessages(messages);

const stream = createUIMessageStream({
execute: ({ writer }) => {
// Send an initial "document tree" snapshot.
writer.write({
type: 'data-mdxTree',
id: ROOT_ID,
data: {
rootId: ROOT_ID,
tree: createInitialTree(),
} satisfies MdxTreeData,
} satisfies UIMessageChunk<never, { mdxTree: MdxTreeData }>);

let fullText = '';

const result = streamText({
model: openai('gpt-4o'),
stopWhen: stepCountIs(1),
messages: modelMessages,
onChunk: ({ chunk }) => {
if (chunk.type !== 'text-delta' || chunk.text.length === 0) {
return;
}

fullText += chunk.text;

// Stream an incremental "tree patch" that the client can apply to its local tree.
// Marked as transient so it doesn't bloat message history, but still triggers onData().
writer.write({
type: 'data-mdxPatch',
data: {
rootId: ROOT_ID,
patch: {
op: 'append-text',
path: TEXT_VALUE_PATH,
text: chunk.text,
} satisfies MdxTreePatch,
} satisfies MdxPatchData,
transient: true,
} satisfies UIMessageChunk<never, { mdxPatch: MdxPatchData }>);
},
onFinish: () => {
// Send a final snapshot for recovery / persistence.
writer.write({
type: 'data-mdxTree',
id: ROOT_ID,
data: {
rootId: ROOT_ID,
tree: applyTextValue(createInitialTree(), fullText),
} satisfies MdxTreeData,
} satisfies UIMessageChunk<never, { mdxTree: MdxTreeData }>);
},
});

writer.merge(result.toUIMessageStream());
},
});

return createUIMessageStreamResponse({ stream });
}

function applyTextValue(tree: MdxTree, value: string): MdxTree {
// Demo helper: this route uses a fixed document shape with one text node.
// Consumers should use the exported `applyMdxTreePatch` helper with JSON pointers.
if (
tree.type !== 'element' ||
tree.name !== 'root' ||
tree.children?.[0]?.type !== 'element' ||
tree.children[0].children?.[0]?.type !== 'text'
) {
return tree;
}

return {
...tree,
children: [
{
...tree.children[0],
children: [{ type: 'text', value }],
},
],
};
}
122 changes: 122 additions & 0 deletions examples/next-openai/app/use-chat-mdx-tree-patches/page.tsx
@@ -0,0 +1,122 @@
'use client';

import ChatInput from '@/components/chat-input';
import { useChat } from '@ai-sdk/react';
import { renderMdxTree } from '@ai-sdk/react';
import {
applyMdxTreePatch,
DefaultChatTransport,
type FinishReason,
type MdxTree,
type MdxTreePatch,
UIMessage,
} from 'ai';
import { useMemo, useState } from 'react';

type MdxTreeData = { rootId: string; tree: MdxTree };
type MdxPatchData = { rootId: string; patch: MdxTreePatch };

type MyMessage = UIMessage<
never,
{
mdxTree: MdxTreeData;
mdxPatch: MdxPatchData;
}
>;

export default function Chat() {
const [lastFinishReason, setLastFinishReason] = useState<
FinishReason | undefined
>(undefined);
const [tree, setTree] = useState<MdxTree | null>(null);

const { error, status, sendMessage, messages, regenerate, stop } =
useChat<MyMessage>({
transport: new DefaultChatTransport({
api: '/api/use-chat-mdx-tree-patches',
}),
onData: dataPart => {
if (dataPart.type === 'data-mdxTree') {
setTree(dataPart.data.tree);
}

if (dataPart.type === 'data-mdxPatch') {
setTree(prev =>
prev != null ? applyMdxTreePatch(prev, dataPart.data.patch) : prev,
);
}
},
onFinish: ({ finishReason }) => {
setLastFinishReason(finishReason);
},
});

const rendered = useMemo(() => {
if (tree == null) return null;

// Web demo: uses tag names directly. In React Native, you would map 'p', 'strong', etc.
return renderMdxTree(tree);
}, [tree]);

return (
<div className="flex flex-col py-24 mx-auto w-full max-w-md stretch">
<div className="mb-6 p-3 border rounded">
<div className="font-medium">Rendered from streamed “MDX tree”</div>
<div className="whitespace-pre-wrap">{rendered}</div>
</div>

{messages.map(message => (
<div key={message.id} className="whitespace-pre-wrap">
{message.role === 'user' ? 'User: ' : 'AI: '}
{message.parts.map((part, index) => {
if (part.type === 'text') {
return <span key={index}>{part.text}</span>;
}
if (part.type === 'data-mdxTree') {
return (
<pre key={index} className="mt-2 text-xs opacity-70">
{JSON.stringify(part.data.tree, null, 2)}
</pre>
);
}
return null;
})}
</div>
))}

{(status === 'submitted' || status === 'streaming') && (
<div className="mt-4 text-gray-500">
{status === 'submitted' && <div>Loading...</div>}
<button
type="button"
className="px-4 py-2 mt-4 text-blue-500 rounded-md border border-blue-500"
onClick={stop}
>
Stop
</button>
</div>
)}

{error && (
<div className="mt-4">
<div className="text-red-500">An error occurred.</div>
<button
type="button"
className="px-4 py-2 mt-4 text-blue-500 rounded-md border border-blue-500"
onClick={() => regenerate()}
>
Retry
</button>
</div>
)}

{messages.length > 0 && (
<div className="mt-4 text-gray-500">
Finish reason: {String(lastFinishReason)}
</div>
)}

<ChatInput status={status} onSubmit={text => sendMessage({ text })} />
</div>
);
}