Commit

chore (ui): split out ui packages (#1897)
lgrammel committed Jun 10, 2024
1 parent 2076816 commit 85f209a
Showing 200 changed files with 2,318 additions and 1,287 deletions.
10 changes: 10 additions & 0 deletions .changeset/bright-squids-sleep.md
@@ -0,0 +1,10 @@
+---
+'@ai-sdk/ui-utils': patch
+'@ai-sdk/svelte': patch
+'@ai-sdk/react': patch
+'@ai-sdk/solid': patch
+'ai': patch
+'@ai-sdk/vue': patch
+---
+
+chore: extracted ui library support into separate modules
5 changes: 5 additions & 0 deletions .changeset/metal-meals-swim.md
@@ -0,0 +1,5 @@
+---
+'ai': patch
+---
+
+removed (streams): experimental_StreamingReactResponse was removed. Please use AI SDK RSC instead.
10 changes: 5 additions & 5 deletions content/docs/02-getting-started/02-nextjs-app-router.mdx
@@ -50,13 +50,13 @@ Install `ai` and `@ai-sdk/openai`, the Vercel AI package and Vercel AI SDK's [ O
 <div className="my-4">
 <Tabs items={['pnpm', 'npm', 'yarn']}>
 <Tab>
-<Snippet text="pnpm install ai @ai-sdk/openai zod" dark />
+<Snippet text="pnpm install ai @ai-sdk/openai @ai-sdk/react zod" dark />
 </Tab>
 <Tab>
-<Snippet text="npm install ai @ai-sdk/openai zod" dark />
+<Snippet text="npm install ai @ai-sdk/openai @ai-sdk/react zod" dark />
 </Tab>
 <Tab>
-<Snippet text="yarn add ai @ai-sdk/openai zod" dark />
+<Snippet text="yarn add ai @ai-sdk/openai @ai-sdk/react zod" dark />
 </Tab>
 </Tabs>
 </div>
@@ -125,7 +125,7 @@ Update your root page (`app/page.tsx`) with the following code to show a list of
 ```tsx filename="app/page.tsx"
 'use client';

-import { useChat } from 'ai/react';
+import { useChat } from '@ai-sdk/react';

 export default function Chat() {
 const { messages, input, handleInputChange, handleSubmit } = useChat();
@@ -223,7 +223,7 @@ To access this data on the frontend, the `useChat` hook returns an optional valu
 ```tsx filename="app/page.tsx" highlight="6, 9"
 'use client';

-import { useChat } from 'ai/react';
+import { useChat } from '@ai-sdk/react';

 export default function Chat() {
 const { messages, input, handleInputChange, handleSubmit, data } = useChat();
10 changes: 5 additions & 5 deletions content/docs/02-getting-started/03-nextjs-pages-router.mdx
@@ -50,13 +50,13 @@ Install `ai` and `@ai-sdk/openai`, Vercel AI SDK's OpenAI provider.
 <div className="my-4">
 <Tabs items={['pnpm', 'npm', 'yarn']}>
 <Tab>
-<Snippet text="pnpm install ai @ai-sdk/openai zod" dark />
+<Snippet text="pnpm install ai @ai-sdk/openai @ai-sdk/react zod" dark />
 </Tab>
 <Tab>
-<Snippet text="npm install ai @ai-sdk/openai zod" dark />
+<Snippet text="npm install ai @ai-sdk/openai @ai-sdk/react zod" dark />
 </Tab>
 <Tab>
-<Snippet text="yarn add ai @ai-sdk/openai zod" dark />
+<Snippet text="yarn add ai @ai-sdk/openai @ai-sdk/react zod" dark />
 </Tab>
 </Tabs>
 </div>
@@ -129,7 +129,7 @@ Now that you have an API route that can query an LLM, it's time to setup your fr
 Update your root page (`pages/index.tsx`) with the following code to show a list of chat messages and provide a user message input:

 ```tsx filename="pages/index.tsx"
-import { useChat } from 'ai/react';
+import { useChat } from '@ai-sdk/react';

 export default function Chat() {
 const { messages, input, handleInputChange, handleSubmit } = useChat();
@@ -220,7 +220,7 @@ In this code, you:
 To access this data on the frontend, the `useChat` hook returns an optional value that stores this data. Update your root route with the following code to render the streamed data:

 ```tsx filename="pages/index.tsx" highlight="4, 7"
-import { useChat } from 'ai/react';
+import { useChat } from '@ai-sdk/react';

 export default function Chat() {
 const { messages, input, handleInputChange, handleSubmit, data } = useChat();
10 changes: 5 additions & 5 deletions content/docs/02-getting-started/04-svelte.mdx
@@ -44,13 +44,13 @@ Install `ai` and `@ai-sdk/openai`, Vercel AI SDK's OpenAI provider.
 <div className="my-4">
 <Tabs items={['pnpm', 'npm', 'yarn']}>
 <Tab>
-<Snippet text="pnpm install ai @ai-sdk/openai zod" dark />
+<Snippet text="pnpm install ai @ai-sdk/openai @ai-sdk/svelte zod" dark />
 </Tab>
 <Tab>
-<Snippet text="npm install ai @ai-sdk/openai zod" dark />
+<Snippet text="npm install ai @ai-sdk/openai @ai-sdk/svelte zod" dark />
 </Tab>
 <Tab>
-<Snippet text="yarn add ai @ai-sdk/openai zod" dark />
+<Snippet text="yarn add ai @ai-sdk/openai @ai-sdk/svelte zod" dark />
 </Tab>
 </Tabs>
 </div>
@@ -121,7 +121,7 @@ Update your root page (`src/routes/+page.svelte`) with the following code to sho

 ```svelte filename="src/routes/+page.svelte"
 <script>
-import { useChat } from 'ai/svelte';
+import { useChat } from '@ai-sdk/svelte';
 const { input, handleSubmit, messages } = useChat();
 </script>
@@ -209,7 +209,7 @@ To access this data on the frontend, the `useChat` hook returns an optional valu

 ```svelte filename="src/routes/+page.svelte" highlight="4, 8"
 <script>
-import { useChat } from 'ai/svelte';
+import { useChat } from '@ai-sdk/svelte';
 const { input, handleSubmit, messages, data } = useChat();
 </script>
10 changes: 5 additions & 5 deletions content/docs/02-getting-started/05-nuxt.mdx
@@ -44,13 +44,13 @@ Install `ai` and `@ai-sdk/openai`, Vercel AI SDK's OpenAI provider.
 <div className="my-4">
 <Tabs items={['pnpm', 'npm', 'yarn']}>
 <Tab>
-<Snippet text="pnpm install ai @ai-sdk/openai zod" dark />
+<Snippet text="pnpm install ai @ai-sdk/openai @ai-sdk/vue zod" dark />
 </Tab>
 <Tab>
-<Snippet text="npm install ai @ai-sdk/openai zod" dark />
+<Snippet text="npm install ai @ai-sdk/openai @ai-sdk/vue zod" dark />
 </Tab>
 <Tab>
-<Snippet text="yarn add ai @ai-sdk/openai zod" dark />
+<Snippet text="yarn add ai @ai-sdk/openai @ai-sdk/vue zod" dark />
 </Tab>
 </Tabs>
 </div>
@@ -130,7 +130,7 @@ Update your root page (`pages/index.vue`) with the following code to show a list

 ```tsx filename="pages/index.vue"
 <script setup lang="ts">
-import { useChat } from 'ai/vue';
+import { useChat } from '@ai-sdk/vue';

 const { messages, input, handleSubmit } = useChat();
 </script>
@@ -224,7 +224,7 @@ To access this data on the frontend, the `useChat` hook returns an optional valu

 ```tsx filename="pages/index.vue" highlight="4, 9"
 <script setup lang="ts">
-import { useChat } from 'ai/vue';
+import { useChat } from '@ai-sdk/vue';

 const { messages, input, handleSubmit, data } = useChat();
 </script>
2 changes: 1 addition & 1 deletion content/docs/05-ai-sdk-ui/02-chatbot.mdx
@@ -20,7 +20,7 @@ In this guide, you will learn how to use the `useChat` hook to create a chatbot
 ```tsx filename='app/page.tsx'
 'use client';

-import { useChat } from 'ai/react';
+import { useChat } from '@ai-sdk/react';

 export default function Page() {
 const { messages, input, handleInputChange, handleSubmit } = useChat({
2 changes: 1 addition & 1 deletion content/docs/05-ai-sdk-ui/03-chatbot-with-tool-calling.mdx
@@ -123,7 +123,7 @@ There are three things worth mentioning:
 'use client';

 import { ToolInvocation } from 'ai';
-import { Message, useChat } from 'ai/react';
+import { Message, useChat } from '@ai-sdk/react';

 export default function Chat() {
 const { messages, input, handleInputChange, handleSubmit, addToolResult } =
2 changes: 1 addition & 1 deletion content/docs/05-ai-sdk-ui/05-completion.mdx
@@ -14,7 +14,7 @@ In this guide, you will learn how to use the `useCompletion` hook in your applic
 ```tsx filename='app/page.tsx'
 'use client';

-import { useCompletion } from 'ai/react';
+import { useCompletion } from '@ai-sdk/react';

 export default function Page() {
 const { completion, input, handleInputChange, handleSubmit } = useCompletion({
2 changes: 1 addition & 1 deletion content/docs/05-ai-sdk-ui/10-openai-assistants.mdx
@@ -16,7 +16,7 @@ The `useAssistant` hook is currently supported with `ai/react` and `ai/svelte`.
 ```tsx filename='app/page.tsx'
 'use client';

-import { Message, useAssistant } from 'ai/react';
+import { Message, useAssistant } from '@ai-sdk/react';

 export default function Chat() {
 const { status, messages, input, submitMessage, handleInputChange } =
20 changes: 16 additions & 4 deletions content/docs/07-reference/ai-sdk-ui/01-use-chat.mdx
@@ -11,16 +11,28 @@ Allows you to easily create a conversational user interface for your chatbot app

 <Tabs items={['React', 'Svelte', 'Vue', 'Solid']}>
 <Tab>
-<Snippet text="import { useChat } from 'ai/react'" dark prompt={false} />
+<Snippet
+text="import { useChat } from '@ai-sdk/react'"
+dark
+prompt={false}
+/>
 </Tab>
 <Tab>
-<Snippet text="import { useChat } from 'ai/svelte'" dark prompt={false} />
+<Snippet
+text="import { useChat } from '@ai-sdk/svelte'"
+dark
+prompt={false}
+/>
 </Tab>
 <Tab>
-<Snippet text="import { useChat } from 'ai/vue'" dark prompt={false} />
+<Snippet text="import { useChat } from '@ai-sdk/vue'" dark prompt={false} />
 </Tab>
 <Tab>
-<Snippet text="import { useChat } from 'ai/solid'" dark prompt={false} />
+<Snippet
+text="import { useChat } from '@ai-sdk/solid'"
+dark
+prompt={false}
+/>
 </Tab>
 </Tabs>
8 changes: 4 additions & 4 deletions content/docs/07-reference/ai-sdk-ui/02-use-completion.mdx
@@ -12,28 +12,28 @@ Allows you to create text completion based capibilities for your application. It
 <Tabs items={['React', 'Svelte', 'Vue', 'Solid']}>
 <Tab>
 <Snippet
-text="import { useCompletion } from 'ai/react'"
+text="import { useCompletion } from '@ai-sdk/react'"
 dark
 prompt={false}
 />
 </Tab>
 <Tab>
 <Snippet
-text="import { useCompletion } from 'ai/svelte'"
+text="import { useCompletion } from '@ai-sdk/svelte'"
 dark
 prompt={false}
 />
 </Tab>
 <Tab>
 <Snippet
-text="import { useCompletion } from 'ai/vue'"
+text="import { useCompletion } from '@ai-sdk/vue'"
 dark
 prompt={false}
 />
 </Tab>
 <Tab>
 <Snippet
-text="import { useCompletion } from 'ai/solid'"
+text="import { useCompletion } from '@ai-sdk/solid'"
 dark
 prompt={false}
 />
4 changes: 2 additions & 2 deletions content/docs/07-reference/ai-sdk-ui/03-use-assistant.mdx
@@ -20,14 +20,14 @@ This works in conjunction with [`AssistantResponse`](./assistant-response) in th
 <Tabs items={['React', 'Svelte']}>
 <Tab>
 <Snippet
-text="import { useAssistant } from 'ai/react'"
+text="import { useAssistant } from '@ai-sdk/react'"
 dark
 prompt={false}
 />
 </Tab>
 <Tab>
 <Snippet
-text="import { useAssistant } from 'ai/svelte'"
+text="import { useAssistant } from '@ai-sdk/svelte'"
 dark
 prompt={false}
 />

This file was deleted.

5 changes: 0 additions & 5 deletions content/docs/07-reference/stream-helpers/index.mdx
@@ -15,11 +15,6 @@ description: Learn to use help functions that help stream generations from diffe
 description: 'Create a streaming response for text generations.',
 href: '/docs/reference/stream-helpers/streaming-text-response',
 },
-{
-title: 'StreamingReactResponse',
-description: 'stream React responses in a server action environment.',
-href: '/docs/reference/stream-helpers/streaming-react-response',
-},
 {
 title: 'streamtoResponse',
 description: 'Pipe a ReadableStream to a Node.js ServerResponse object.',
@@ -56,7 +56,7 @@ You can replace the `imageUrl` with the actual URL of the image you want to send
 ```typescript filename='app/page.tsx' highlight="18-20"
 'use client';

-import { useChat } from 'ai/react';
+import { useChat } from '@ai-sdk/react';

 // Force the page to be dynamic and allow streaming responses up to 30 seconds
 export const dynamic = 'force-dynamic';
@@ -16,7 +16,7 @@ Text generation can sometimes take a long time to complete, especially when you'
 Let's create a simple React component that imports the `useCompletion` hook from the `ai/react` module. The `useCompletion` hook will call the `/api/completion` endpoint when a button is clicked. The endpoint will generate text based on the input prompt and stream it to the client.

 ```tsx filename="pages/index.tsx"
-import { useCompletion } from 'ai/react';
+import { useCompletion } from '@ai-sdk/react';

 export default function Page() {
 const { completion, complete } = useCompletion({
@@ -27,7 +27,7 @@ Chat completion can sometimes take a long time to finish, especially when the re
 Let's create a React component that imports the `useChat` hook from the `ai/react` module. The `useChat` hook will call the `/api/chat` endpoint when the user sends a message. The endpoint will generate the assistant's response based on the conversation history and stream it to the client.

 ```tsx filename='pages/index.tsx'
-import { useChat } from 'ai/react';
+import { useChat } from '@ai-sdk/react';

 export default function Page() {
 const { messages, input, setInput, append } = useChat();
2 changes: 1 addition & 1 deletion content/examples/02-next-pages/10-tools/01-call-tool.mdx
@@ -32,7 +32,7 @@ Let's create a React component that imports the `useChat` hook from the `ai/reac
 We will use the `maxToolRoundtrips` to specify the maximum number of consecutive tool calls that can made before the model or the user responds with a text message. In this example, you will set it to `1` to allow for one round trip to happen.

 ```tsx filename='pages/index.tsx'
-import { useChat } from 'ai/react';
+import { useChat } from '@ai-sdk/react';

 export default function Page() {
 const { messages, input, setInput, append } = useChat({
@@ -32,7 +32,7 @@ Let's create a React component that imports the `useChat` hook from the `ai/reac
 You will use the `maxToolRoundtrips` to specify the maximum number of consecutive tool calls that can made before the model or the user responds with a text message. In this example, you will set it to `1` to allow for one round trip to happen.

 ```tsx filename='pages/index.tsx'
-import { useChat } from 'ai/react';
+import { useChat } from '@ai-sdk/react';

 export default function Page() {
 const { messages, input, setInput, append } = useChat({
@@ -42,7 +42,7 @@ Let's build an assistant that gets the weather for any city by calling the `getW

 ```tsx filename='pages/index.tsx'
 import { ToolInvocation } from 'ai';
-import { Message, useChat } from 'ai/react';
+import { Message, useChat } from '@ai-sdk/react';

 export default function Chat() {
 const { messages, input, handleInputChange, handleSubmit, addToolResult } =
@@ -10,7 +10,7 @@ description: Learn to stream assistant responses using the Vercel AI SDK in your
 Let's create a simple chat interface that allows users to send messages to the assistant and receive responses. You will integrate the `useAssistant` hook from `ai/react` to stream the messages and status.

 ```tsx filename='pages/index.tsx'
-import { Message, useAssistant } from 'ai/react';
+import { Message, useAssistant } from '@ai-sdk/react';

 export default function Page() {
 const { status, messages, input, submitMessage, handleInputChange } =
@@ -31,7 +31,7 @@ You will need to provide the list of tools on the OpenAI [Assistant Dashboard](h
 Let's create a simple chat interface that allows users to send messages to the assistant and receive responses. You will integrate the `useAssistant` hook from `ai/react` to stream the messages and status.

 ```tsx filename='pages/index.tsx'
-import { Message, useAssistant } from 'ai/react';
+import { Message, useAssistant } from '@ai-sdk/react';

 export default function Page() {
 const { status, messages, input, submitMessage, handleInputChange } =
2 changes: 1 addition & 1 deletion content/providers/04-adapters/langchain.mdx
@@ -44,7 +44,7 @@ Then, we use the Vercel AI SDK's [`useCompletion`](/docs/ai-sdk-ui/completion) m
 ```tsx filename="app/page.tsx"
 'use client';

-import { useCompletion } from 'ai/react';
+import { useCompletion } from '@ai-sdk/react';

 export default function Chat() {
 const { completion, input, handleInputChange, handleSubmit } =
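Every file in this commit applies the same one-line rewrite: framework-specific subpath imports from the `ai` package (`ai/react`, `ai/svelte`, `ai/vue`, `ai/solid`) move to dedicated `@ai-sdk/*` packages, while core `ai` imports stay put. As a rough sketch of that mapping for downstream migration tooling, the `migrateImportPath` helper below is hypothetical (it is not part of the SDK):

```typescript
// Hypothetical helper sketching the import-path rewrite this commit
// applies across the docs. Framework subpaths of "ai" move to
// dedicated @ai-sdk/* packages; everything else is left alone.
const EXTRACTED_FRAMEWORKS = new Set(['react', 'svelte', 'vue', 'solid']);

function migrateImportPath(specifier: string): string {
  const match = specifier.match(/^ai\/([a-z]+)$/);
  if (match && EXTRACTED_FRAMEWORKS.has(match[1])) {
    return `@ai-sdk/${match[1]}`;
  }
  // Core "ai" imports (streamText, ToolInvocation, ...) are unchanged.
  return specifier;
}

console.log(migrateImportPath('ai/react')); // '@ai-sdk/react'
console.log(migrateImportPath('ai')); // 'ai'
```

A codemod built on this idea would still need to add the new `@ai-sdk/*` packages to `package.json`, as the updated install snippets in this diff do.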