fetch, ESM & streamCompletion util #45
gfortaine commented Jan 4, 2023 (edited)
- chore(upgrade): upgrade to TypeScript 4.9
- chore(upgrade): enable ESM support
- chore(upgrade): add streamCompletion helper function (credits to @schnerd & @rauschma)
- chore(upgrade): update README for fetch support
Related: openai/openai-openapi#10
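The `streamCompletion` helper itself isn't shown in this thread. As context, here is a minimal sketch of what such an SSE-parsing helper can look like, assuming the API emits `data: {json}` lines terminated by a `data: [DONE]` sentinel; all names and framing below are assumptions for illustration, not the PR's actual code:

```typescript
// Hypothetical streamCompletion-style helper: takes an async iterable of raw
// SSE text chunks and yields each `data:` payload until the `[DONE]` sentinel.
async function* streamCompletion(
  chunks: AsyncIterable<string>,
): AsyncGenerator<string> {
  let buffer = "";
  for await (const chunk of chunks) {
    buffer += chunk;
    // SSE events are newline-delimited; keep any partial line buffered.
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? "";
    for (const line of lines) {
      const message = line.replace(/^data: /, "").trim();
      if (message === "[DONE]") return;
      if (message) yield message;
    }
  }
}

// Synthetic stream for demonstration (no network access needed):
async function* fakeStream() {
  yield 'data: {"text":"Hel';
  yield 'lo"}\ndata: {"text":" world"}\n';
  yield "data: [DONE]\n";
}

async function collect(): Promise<string[]> {
  const out: string[] = [];
  for await (const m of streamCompletion(fakeStream())) out.push(m);
  return out;
}
```

Note how partial chunks that split a JSON payload mid-line are handled by buffering up to the last newline before parsing.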
Here is the package 🎉 : https://www.npmjs.com/package/@fortaine/openai
Thank you for all the contributions. Just wanted to let you know that we've seen this and will respond soon; things have been a little busy and we're having trouble carving out time to fully evaluate all the changes included here.
Thank you for making this PR! Very excited for it, as it will allow the client to be used on Edge compute platforms (e.g. Vercel Edge Functions).
@schnerd @leerob @DennisKo this PR should be fully compatible with Vercel's Edge Runtime now 🚀 A few comments:
Really looking forward to this PR - it will enable the openai API to be used in background service workers for web extensions! I feel like bundling isomorphic-fetch and http might bloat the bundle. The ideal setup for me would be for the constructor to allow a custom axios instance to be passed down.
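For context on constructor injection: openapi-generator axios clients follow a pattern where the API class accepts an HTTP client via its constructor, so a custom instance (or a mock) can be swapped in without forking. A dependency-free sketch of that pattern; every name here is illustrative, not taken from the generated openai client:

```typescript
// A pluggable request function stands in for axios/fetch/redaxios.
type RequestFn = (
  url: string,
  init?: { method?: string; body?: string },
) => Promise<{ status: number; data: unknown }>;

// Default transport; a real client would delegate to axios or fetch here.
const defaultRequest: RequestFn = async () => ({ status: 200, data: null });

class BaseAPI {
  constructor(protected request: RequestFn = defaultRequest) {}
}

class CompletionApi extends BaseAPI {
  createCompletion(prompt: string) {
    return this.request("/v1/completions", {
      method: "POST",
      body: JSON.stringify({ prompt }),
    });
  }
}

// Injecting a custom transport, analogous to passing a custom axios
// instance into the generated client's constructor:
const mock: RequestFn = async (url) => ({ status: 200, data: { url } });
const api = new CompletionApi(mock);
```

This is the same design that lets a caller substitute a fetch-backed adapter in environments where Node's http module is unavailable.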
This adds a number of dependencies that make it harder to run in other contexts, for example a Cloudflare Worker (which doesn't run Node). Just switching out axios for fetch would be great on its own.
Any update here? Would love to see this merged soon!
Patiently awaiting this ❤️
I have a lazy version of this which is automated:

```bash
#!/usr/bin/env bash
touch .npmrc && echo "//registry.npmjs.org/:_authToken=\${NPM_TOKEN}" >> .npmrc
openai_version=$(npm view openai version)
ericlewis_version=$(npm view @ericlewis/openai version)

# Compare the versions and output the result
if [[ "$openai_version" == "$ericlewis_version" ]]; then
  echo "Versions match, skipping."
else
  npm install change-package-name typescript@4 --save-dev
  npm uninstall axios
  npm install redaxios
  npx change-package-name @ericlewis/openai
  sed -i "s/import type { AxiosPromise, AxiosInstance, AxiosRequestConfig } from 'axios';/type AxiosPromise<T = any> = Promise<{data: T, status: number, statusText: string, request?: any, headers: any, config: any}>;\ntype AxiosInstance = any;\ntype AxiosRequestConfig = any;\nimport globalAxios from 'redaxios';/g" *.ts
  sed -i "s/import type { AxiosInstance, AxiosResponse } from 'axios';/type AxiosInstance = any;/g" *.ts
  sed -i "s/<T = unknown, R = AxiosResponse<T>>//g" *.ts
  sed -i "s/return axios.request<T, R>(axiosRequestArgs);/return axios.request(axiosRequestArgs);/g" *.ts
  sed -i 's/"target": "es6",/"target": "es2021",\n    "esModuleInterop": true,/g' tsconfig.json
  sed -i '/import globalAxios from '\''axios'\'';/d' *.ts
  npm run build
  npm publish
fi
```
@leerob @DennisKo @louisgv @danielrhodes @cfortuner @rogerahuntley @ericlewis cc @schnerd Here it is 🎉 :

```js
import fetchAdapter from "@haverstack/axios-fetch-adapter";

const configuration = new Configuration({
  apiKey: process.env.OPENAI_API_KEY,
  baseOptions: {
    adapter: fetchAdapter
  }
});
```

Various references:
Note: It doesn't seem like
Is there an ETA for this? How can I help move this forward?
This is sorely needed. Can someone speak to what is needed and where those wanting to help can look to assist?
@s123121 @dustinlacewell in the meantime you can use openai-edge (supports streaming).
It's not helpful for third-party libraries that use openai though. (Though it is very welcome, thanks for making it.) |
About to install and try this out. Are you saying it won't work for non-Next apps like Nuxt, etc.?
@dosstx did you try it?

```js
import { Configuration, OpenAIApi } from "openai-edge"

const configuration = new Configuration({
  apiKey: "your-key-here"
})

const openai = new OpenAIApi(configuration, null, $fetch)
```
@dan-kwiat Using Nuxt and setting up the code like so, I get an empty response object back despite a 200 status:

```text
server: Response {
  [Symbol(realm)]: { settingsObject: {} },
  [Symbol(state)]: {
    aborted: false,
    rangeRequested: false,
    timingAllowPassed: false,
    requestIncludesCredentials: false,
    urlList: [],
    body: { stream: undefined, source: null, length: null }
  },
  [Symbol(headers)]: HeadersList {
    cookies: null,
    [Symbol(headers map)]: Map(4) {
      'access-control-allow-origin' => [Object],
      'content-type' => [Object],
      'cache-control' => [Object],
      'x-accel-buffering' => [Object]
    },
    [Symbol(headers map sorted)]: null
  }
}
```
```ts
import { Configuration, OpenAIApi } from "openai-edge"

const rConfig = useRuntimeConfig()
const configuration = new Configuration({
  apiKey: rConfig.OPENAI_API_KEY,
})
const openai = new OpenAIApi(configuration)

export const config = {
  runtime: "edge",
}

export default defineEventHandler(async (event) => {
  try {
    const completion = await openai.createChatCompletion({
      model: "gpt-3.5-turbo",
      messages: [
        { role: "system", content: "You are a helpful assistant." },
        { role: "user", content: "Who won the world series in 2020?" },
        {
          role: "assistant",
          content: "The Los Angeles Dodgers won the World Series in 2020.",
        },
        { role: "user", content: "Where was it played?" },
      ],
      max_tokens: 7,
      temperature: 0,
      stream: true,
    })
    const response = new Response(completion.body, {
      headers: {
        "Access-Control-Allow-Origin": "*",
        "Content-Type": "text/event-stream;charset=utf-8",
        "Cache-Control": "no-cache, no-transform",
        "X-Accel-Buffering": "no",
      },
    })
    console.log('server: ', response)
    return response
  } catch (error: any) {
    console.error(error)
    return new Response(JSON.stringify(error), {
      status: 400,
      headers: {
        "content-type": "application/json",
      },
    })
  }
})
```
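One thing worth noting about the empty-looking log: `console.log` on a Response object prints only its internal state, never the data, because a streamed body has to be consumed. A sketch of draining a `text/event-stream` body chunk by chunk, assuming a runtime with WHATWG streams (Node 18+ or an edge runtime); the names below are illustrative:

```typescript
// Drain a streamed Response body into a string. In the handler above this
// would consume `completion.body` (which can only be read once).
async function readStream(body: ReadableStream<Uint8Array>): Promise<string> {
  const reader = body.getReader();
  const decoder = new TextDecoder();
  let text = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    text += decoder.decode(value, { stream: true });
  }
  return text;
}

// Synthetic stream standing in for an SSE body (no network needed):
const fakeBody = new ReadableStream<Uint8Array>({
  start(controller) {
    const encode = (s: string) => new TextEncoder().encode(s);
    controller.enqueue(encode("data: hello\n"));
    controller.enqueue(encode("data: [DONE]\n"));
    controller.close();
  },
});
```

In a browser or Nuxt client, the same loop over `response.body.getReader()` is what surfaces the tokens as they arrive, rather than inspecting the Response wrapper itself.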
I was speaking of third-party packages that use openai as a dependency. No good way of forcing those packages to use openai-edge afaik.
Thank you for putting this together! Great news: we have a new, fully rewritten upcoming version, v4.0.0, that supports ESM, uses fetch, and has conveniences for streaming (with more coming soon). Please give it a try and let us know what you think in the linked discussion!