OpenAI for Next.js
The OpenAI for Next.js project is a toolkit that simplifies the integration of OpenAI streams into Next.js applications. Aimed at developers who want to use OpenAI within their web applications, it provides hooks and components that handle the complexities of streaming data.
Installation
To get started, two packages need to be installed: nextjs-openai for the frontend functionality and openai-streams for API route handling. Both can be installed with either yarn or npm:
yarn add nextjs-openai openai-streams
# -or-
npm i --save nextjs-openai openai-streams
Hooks
The project offers two hooks, useBuffer() and useTextBuffer(), which manage an incrementing buffer of data or text streamed from a specified URL. For example, useTextBuffer() can fetch streaming text from an API route and expose it to a component, along with a refresh function to restart the stream, a cancel function to abort it, and a done flag that reports completion status.
import { StreamingText, useTextBuffer } from "nextjs-openai";

export default function Demo() {
  const { buffer, refresh, cancel, done } = useTextBuffer({
    url: "/api/demo"
  });

  return (
    <div>
      <StreamingText buffer={buffer} fade={600} />
      <button onClick={refresh} disabled={!done}>Refresh</button>
      <button onClick={cancel} disabled={done}>Cancel</button>
    </div>
  );
}
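Conceptually, a text buffer like this is built by reading a streamed fetch response chunk by chunk and appending each decoded piece to a growing string. A minimal sketch of that pattern in plain TypeScript (illustrative only; collectTextBuffer is a hypothetical helper, not part of nextjs-openai):

```typescript
// Sketch of the incremental text-buffer pattern that streaming hooks
// implement: read a streamed body chunk by chunk and accumulate decoded
// text, notifying a callback (e.g. a React state setter) on each chunk.
async function collectTextBuffer(
  stream: ReadableStream<Uint8Array>,
  onChunk?: (bufferSoFar: string) => void
): Promise<string> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    onChunk?.(buffer); // a UI hook would trigger a re-render here
  }
  return buffer;
}

// Demo: simulate a streamed response arriving in two chunks.
async function main() {
  const encoder = new TextEncoder();
  const stream = new ReadableStream<Uint8Array>({
    start(controller) {
      controller.enqueue(encoder.encode("Hello, "));
      controller.enqueue(encoder.encode("stream!"));
      controller.close();
    },
  });
  const text = await collectTextBuffer(stream, (b) => console.log(b));
  console.log("final:", text);
}

main();
```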
Components
To display the streaming text, the <StreamingText> and <StreamingTextURL> components are provided. They render text with smooth fade-in animations, so streamed data appears with a visually appealing effect.
import { StreamingTextURL } from "nextjs-openai";

export default function Demo() {
  return (
    <StreamingTextURL
      url="/api/demo"
      fade={600}
      throttle={100}
    />
  );
}
Sending Data and Advanced Usage
For advanced use cases, developers may need to customize the network request by setting { method, data }. By default, data is sent via a POST request; this can be changed to a GET request by setting the method parameter and adjusting the URL accordingly.
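Since a GET request has no body, one common way to "adjust the URL appropriately" is to encode the payload into the query string. A hedged sketch of that (buildGetUrl is a hypothetical helper, not part of nextjs-openai):

```typescript
// Encode a flat data object into a GET-friendly URL.
// Hypothetical helper for illustration; nextjs-openai may serialize
// GET payloads differently.
function buildGetUrl(base: string, data: Record<string, string>): string {
  const params = new URLSearchParams(data);
  return `${base}?${params.toString()}`;
}

const url = buildGetUrl("/api/demo", { name: "John" });
console.log(url); // "/api/demo?name=John"
```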
Usage with <StreamingTextURL>:
import { useState } from "react";
import { StreamingTextURL } from "nextjs-openai";

export default function Home() {
  const [data, setData] = useState({ name: "John" });
  // ...
  return (
    <StreamingTextURL url="/api/demo" data={data} />
  );
}
Usage with useTextBuffer():
import { useState } from "react";
import { useTextBuffer, StreamingText } from "nextjs-openai";

export default function Home() {
  const [data, setData] = useState({ name: "John" });
  const { buffer, refresh, cancel } = useTextBuffer({
    url: "/api/demo",
    throttle: 100,
    data,
    options: {
      headers: {
        // ...
      }
    }
  });
  // ...
  return (
    <StreamingText buffer={buffer} />
  );
}
API Routes
The project also includes tools for handling OpenAI streams within API routes.
Edge Runtime:
For those using the Edge runtime, the following example creates an OpenAI stream with the OpenAI function and returns it directly as the response:
import { OpenAI } from "openai-streams";

export default async function handler() {
  const stream = await OpenAI("completions", {
    model: "text-davinci-003",
    prompt: "Write a happy sentence.\n\n",
    max_tokens: 25
  });

  return new Response(stream);
}

export const config = {
  runtime: "edge"
};
Node Environment (Node.js < 18):
For environments that do not support the Edge runtime, a Node.js-compatible import is available:
import type { NextApiRequest, NextApiResponse } from "next";
import { OpenAI } from "openai-streams/node";

export default async function handler(_: NextApiRequest, res: NextApiResponse) {
  const stream = await OpenAI("completions", {
    model: "text-davinci-003",
    prompt: "Write a happy sentence.\n\n",
    max_tokens: 25
  });

  stream.pipe(res);
}
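The JSON payload sent by the client-side hooks arrives as the request body in handlers like these, where it can be folded into the completion request. A hedged sketch of that step (buildCompletionBody is a hypothetical helper, and the payload shape is assumed, not part of openai-streams):

```typescript
// Hypothetical helper: turn the client's { name } payload into the
// completion request body. The payload field names are assumptions
// for illustration.
interface CompletionBody {
  model: string;
  prompt: string;
  max_tokens: number;
}

function buildCompletionBody(payload: { name: string }): CompletionBody {
  return {
    model: "text-davinci-003",
    prompt: `Write a happy sentence about ${payload.name}.\n\n`,
    max_tokens: 25,
  };
}

// In an Edge handler this might be used roughly as:
//   const payload = await req.json();
//   const stream = await OpenAI("completions", buildCompletionBody(payload));
//   return new Response(stream);
console.log(buildCompletionBody({ name: "John" }).prompt);
```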
With these hooks, components, and API route helpers, OpenAI for Next.js makes it substantially easier to add AI-driven streaming to web applications, shortening development time without sacrificing functionality.