This library returns OpenAI API responses as streams only. Non-stream endpoints like edits are simply a stream with a single chunk update.
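To illustrate the idea, here is a minimal sketch of how a one-shot result can be expressed as a single-chunk stream. The helper name `singleChunkStream` and the JSON payload are assumptions for illustration, not part of the library's API:

```typescript
// Sketch: a "non-stream" result wrapped as a stream that emits exactly one
// chunk and then closes, matching the behavior described above.
function singleChunkStream(payload: string): ReadableStream<Uint8Array> {
  return new ReadableStream({
    start(controller) {
      // Encode the full payload as a single chunk.
      controller.enqueue(new TextEncoder().encode(payload));
      controller.close();
    },
  });
}
```

A consumer reads it exactly like any other stream; the stream simply finishes after the first read.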
It simplifies the following:

- Prioritizes streaming and type inference.
- Auto-loads `OPENAI_API_KEY` from `process.env`.
- Uses the same function for all endpoints, switching the type based on the `OpenAI(endpoint, ...)` signature.

Overall, the library aims to make it as simple as possible to call the API and stream updates in.
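The single-function, per-endpoint typing could be sketched with a generic parameter keyed on the endpoint name. The endpoint names and param shapes below are simplified assumptions (the library's real types come from the `openai` package), and the stub body just echoes its arguments rather than calling the API:

```typescript
// Assumed, simplified request shapes per endpoint (illustration only).
interface EndpointParams {
  completions: { model: string; prompt: string; max_tokens?: number };
  edits: { model: string; input: string; instruction: string };
}

// One function for all endpoints: `params` is narrowed by `endpoint`, so
// passing edit params to "completions" is a compile-time error.
function OpenAI<E extends keyof EndpointParams>(
  endpoint: E,
  params: EndpointParams[E],
): { endpoint: E; params: EndpointParams[E] } {
  // Stub body for the sketch; the real function calls the API and
  // returns a stream.
  return { endpoint, params };
}
```

This is why a single `OpenAI(endpoint, params)` call can enforce a different request type for each endpoint without separate functions.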
- Set the `OPENAI_API_KEY` env variable. The runtime will throw if this is not available.
- Call the API via `await OpenAI(endpoint, params)`. The `params` type will be inferred based on the `endpoint` you provide, e.g. for the `"edits"` endpoint, `import('openai').CreateEditRequest` will be enforced.
```ts
export default async function test() {
  // The params type is inferred from the "completions" endpoint.
  const stream = await OpenAI("completions", {
    model: "text-davinci-003",
    prompt: "Write a two-sentence paragraph.\n\n",
    temperature: 1,
    max_tokens: 100,
  });

  // Return the stream directly as the response body.
  return new Response(stream);
}
```
- Internally, streams are often manipulated using generators via `for await (const chunk of yieldStream(stream)) { ... }`. We recommend following this pattern if you find it intuitive.
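The generator pattern above can be sketched as follows. This is an assumed, minimal version of a `yieldStream`-style helper over a `ReadableStream<Uint8Array>`; the library's actual helper may differ:

```typescript
// Minimal sketch: turn a ReadableStream into an async generator of chunks.
async function* yieldStream(stream: ReadableStream<Uint8Array>) {
  const reader = stream.getReader();
  try {
    while (true) {
      const { done, value } = await reader.read();
      if (done) return;
      yield value;
    }
  } finally {
    // Always release the lock, even if the consumer breaks out early.
    reader.releaseLock();
  }
}

// Usage: decode and accumulate each chunk as it arrives.
async function collectText(stream: ReadableStream<Uint8Array>) {
  const decoder = new TextDecoder();
  let text = "";
  for await (const chunk of yieldStream(stream)) {
    text += decoder.decode(chunk, { stream: true });
  }
  return text;
}
```

The `try`/`finally` around the reader matters: if the consuming loop exits early, the lock is released so the stream can be read or cancelled elsewhere.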