[Error] Too many requests #2
Comments
Sounds like there has been some change in the OpenAI API, since previously there was no issue. I'll check what the rate limits currently are; this should be easy to fix |
Good, I'll be waiting. Thanks for your effort. |
Not sure if these have changed since the recent OpenAI dev day (from memory, I think they mentioned increasing the rate limits?).
Ideally the code would know how to identify when a rate limit was hit, and provide some feedback/recovery option to the user (e.g. slow down, wait for the limit period to reset, etc.) |
https://platform.openai.com/docs/guides/rate-limits?context=tier-free I was first thinking about throttling the requests to an allowed level, but I think it makes more sense to handle the rate-limit error as it occurs, since there may be other users at the organization level exhausting the rate limit at the same time 🤔 |
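As a rough sketch of that reactive approach (the function name, retry budget, and the axios error shape here are assumptions for illustration, not code from the repo), the call could simply be retried whenever a 429 comes back:

import axios from "axios";

// Hypothetical helper: retry a single OpenAI call when the API answers 429,
// waiting a short, fixed period before trying again.
async function completeWithRetry<T>(
  doRequest: () => Promise<T>,
  maxRetries = 5
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await doRequest();
    } catch (err) {
      const status = axios.isAxiosError(err) ? err.response?.status : undefined;
      if (status !== 429 || attempt >= maxRetries) throw err;
      // Fixed wait; the x-ratelimit-reset-* headers could be used to refine this.
      await new Promise((resolve) => setTimeout(resolve, 5_000));
    }
  }
}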
@jehna Yeah, that sounds like a good approach to me. I'm not sure, but do the standard API responses return any details about how much of the rate limit is 'remaining' in their headers/etc.? Or is it opaque until you hit an error response? Edit: Sounds like they do.
They also have notes on error mitigation, including exponential backoff and request batching. |
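For reference, a minimal sketch of reading those headers from an axios response (the header names are the ones visible in the 429 dump at the bottom of this issue; everything else is illustrative):

import { AxiosResponse } from "axios";

// Summarize OpenAI's rate-limit headers so the CLI could tell the user how much
// quota remains and when it resets.
function describeRateLimit(response: AxiosResponse): string {
  const h = response.headers;
  return [
    `requests: ${h["x-ratelimit-remaining-requests"]}/${h["x-ratelimit-limit-requests"]}`,
    `tokens: ${h["x-ratelimit-remaining-tokens"]}/${h["x-ratelimit-limit-tokens"]}`,
    `resets: ${h["x-ratelimit-reset-requests"]} (requests), ${h["x-ratelimit-reset-tokens"]} (tokens)`,
  ].join(", ");
}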
@jehna @0xdevalias @cairocoder I'm the maintainer of LiteLLM; we allow you to maximize your throughput by load balancing between multiple deployments (Azure, OpenAI). Here's how to use it:

import os
from litellm import Router
model_list = [{  # list of model deployments
    "model_name": "gpt-3.5-turbo",  # model alias
    "litellm_params": {  # params for litellm completion/embedding call
        "model": "azure/chatgpt-v-2",  # actual model name
        "api_key": os.getenv("AZURE_API_KEY"),
        "api_version": os.getenv("AZURE_API_VERSION"),
        "api_base": os.getenv("AZURE_API_BASE")
    }
}, {
    "model_name": "gpt-3.5-turbo",
    "litellm_params": {  # params for litellm completion/embedding call
        "model": "azure/chatgpt-functioncalling",
        "api_key": os.getenv("AZURE_API_KEY"),
        "api_version": os.getenv("AZURE_API_VERSION"),
        "api_base": os.getenv("AZURE_API_BASE")
    }
}, {
    "model_name": "gpt-3.5-turbo",
    "litellm_params": {  # params for litellm completion/embedding call
        "model": "vllm/TheBloke/Marcoroni-70B-v1-AWQ",
        "api_key": os.getenv("OPENAI_API_KEY"),
    }
}]

router = Router(model_list=model_list)

# openai.ChatCompletion.create replacement
response = router.completion(model="gpt-3.5-turbo",
                             messages=[{"role": "user", "content": "Hey, how's it going?"}])

print(response) |
This is still a major limitation that requires mitigation.
|
This would seem to be the section of code making the API call (at least for the OpenAI plugin): humanify/src/plugins/openai/openai-rename.ts Lines 6 to 38 in 64a1b95
You would presumably need to add some logic to catch/detect/recover from this error around that API call. Ideally, this pattern could be lifted into a more generic catch/retry type handler so that it could be reused across the other plugins as well (a sketch follows below). |
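A hedged sketch of what that generic handler might look like (the name withRateLimitRetry and the isRateLimitError predicate are hypothetical, not existing humanify code); each plugin would pass in its own way of recognizing a rate-limit error:

// Plugin-agnostic retry helper using exponential backoff with jitter,
// roughly following OpenAI's own mitigation guidance.
async function withRateLimitRetry<T>(
  fn: () => Promise<T>,
  isRateLimitError: (err: unknown) => boolean,
  maxRetries = 6
): Promise<T> {
  let delayMs = 1_000;
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (!isRateLimitError(err) || attempt >= maxRetries) throw err;
      const jitter = Math.random() * delayMs * 0.25;
      await new Promise((resolve) => setTimeout(resolve, delayMs + jitter));
      delayMs *= 2; // double the wait after each rate-limited attempt
    }
  }
}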
I'm currently trying to use Gemini since I don't have an OpenAI API key, so it would be great if the solution could be adapted for other plugins too. I appreciate your continued support with my requests! |
Related issues: |
|
I am getting a "Too many requests" error:
humanify/node_modules/axios/lib/core/createError.js:16
var error = new Error(message);
            ^
Error: Request failed with status code 429
    at createError (/Users/cairocoder/Sites/localhost/humanify/node_modules/axios/lib/core/createError.js:16:15)
    at settle (/Users/cairocoder/Sites/localhost/humanify/node_modules/axios/lib/core/settle.js:17:12)
    at IncomingMessage.handleStreamEnd (/Users/cairocoder/Sites/localhost/humanify/node_modules/axios/lib/adapters/http.js:322:11)
    at IncomingMessage.emit (node:events:526:35)
    at IncomingMessage.emit (node:domain:489:12)
    at endReadableNT (node:internal/streams/readable:1359:12)
    at processTicksAndRejections (node:internal/process/task_queues:82:21) {
  config: { transitional: { silentJSONParsing: true, forcedJSONParsing: true, clarifyTimeoutError: false }, adapter: [Function: httpAdapter], transformRequest: [ [Function: transformRequest] ], transformResponse: [ [Function: transformResponse] ], timeout: 0, xsrfCookieName: 'XSRF-TOKEN', xsrfHeaderName: 'X-XSRF-TOKEN', maxContentLength: -1, maxBodyLength: -1, validateStatus: [Function: validateStatus], headers: { Accept: 'application/json, text/plain, */*', 'Content-Type': 'application/json', 'User-Agent': 'OpenAI/NodeJS/3.3.0', Authorization: 'Bearer xxx', 'Content-Length': 7364 }, method: 'post', data:
{"model":"gpt-3.5-turbo-16k","functions":[{"name":"rename_variables_and_functions","description":"Rename variables and function names in Javascript code","parameters":{"type":"object","properties":{"variablesAndFunctionsToRename":{"type":"array","items":{"type":"object","properties":{"name":{"type":"string","description":"The name of the variable or function name to rename"},"newName":{"type":"string","description":"The new name of the variable or function name"}},"required":["name","newName"]}}},"required":["variablesToRename"]}}],"messages":[{"role":"assistant","content":"Rename all Javascript variables and functions to have descriptive names based on their usage in the code."},{"role":"user","content":" a = a.join(\" \");\n }\n if (Ud(b)) {\n b = b.join(\" \");\n }\n return a + \" \" + b;\n } else {\n return a;\n }\n } else {\n return b;\n }\n } else {\n return \"\";\n }\n }\n function eb(a) {\n for (var b = 0; b < a.length; b++) {\n var c = a[b];\n if (c.nodeType === Je) {\n return c;\n }\n }\n }\n function fb(a) {\n if (v(a)) {\n a = a.split(\" \");\n }\n var b = ma();\n d(a, function (a) {\n if (a.length) {\n b[a] = true;\n }\n });\n return b;\n }\n function gb(a) {\n if (t(a)) {\n return a;\n } else {\n return {};\n }\n }\n function hb(a, b, c, e) {\n function f(a) {\n try {\n a.apply(null, Q(arguments, 1));\n } finally {\n s--;\n if (s === 0) {\n for (; t.length;) {\n try {\n t.pop()();\n } catch (b) {\n c.error(b);\n }\n }\n }\n }\n }\n function g(a) {\n var b = a.indexOf(\"#\");\n if (b === -1) {\n return \"\";\n } else {\n return a.substr(b);\n }\n }\n function h() {\n y = null;\n i();\n j();\n }\n function i() {\n u = z();\n if (r(u)) {\n u = null;\n } else {\n u = u;\n }\n if (O(u, C)) {\n u = C;\n }\n C = u;\n }\n function j() {\n if (w !== k.url() || v !== u) {\n w = k.url();\n v = u;\n d(A, function (a) {\n a(k.url(), u);\n });\n }\n }\n var k = this;\n var l = a.location;\n var m = a.history;\n var o = a.setTimeout;\n var p = a.clearTimeout;\n var q = {};\n k.isMock = false;\n var s = 0;\n var t = [];\n k.$$completeOutstandingRequest = f;\n k.$$incOutstandingRequestCount = function () {\n s++;\n };\n k.notifyWhenNoOutstandingRequests = function (a) {\n if (s === 0) {\n a();\n } else {\n t.push(a);\n }\n };\n var u;\n var v;\n var w = l.href;\n var x = b.find(\"base\");\n var y = null;\n var z = e.history ? function () {\n try {\n return m.state;\n } catch (a) {}\n } : n;\n i();\n v = u;\n k.url = function (b, c, d) {\n if (r(d)) {\n d = null;\n }\n if (l !== a.location) {\n l = a.location;\n }\n if (m !== a.history) {\n m = a.history;\n }\n if (b) {\n var f = v === d;\n if (w === b && (!e.history || f)) {\n return k;\n }\n var h = w && Pb(w) === Pb(b);\n w = b;\n v = d;\n if (!e.history || h && f) {\n if (!h) {\n y = b;\n }\n if (c) {\n l.replace(b);\n } else if (h) {\n l.hash = g(b);\n } else {\n l.href = b;\n }\n if (l.href !== b) {\n y = b;\n }\n } else {\n m[c ? 
\"replaceState\" : \"pushState\"](d, \"\", b);\n i();\n v = u;\n }\n if (y) {\n y = b;\n }\n return k;\n }\n return y || l.href.replace(/%27/g, \"'\");\n };\n k.state = function () {\n return u;\n };\n var A = [];\n var B = false;\n var C = null;\n k.onUrlChange = function (b) {\n if (!B) {\n if (e.history) {\n Jd(a).on(\"popstate\", h);\n }\n Jd(a).on(\"hashchange\", h);\n B = true;\n }\n A.push(b);\n return b;\n };\n k.$$applicationDestroyed = function () {\n Jd(a).off(\"hashchange popstate\", h);\n };\n k.$$checkUrlChange = j;\n k.baseHref = function () {\n var a = x.attr(\"href\");\n if (a) {\n return a.replace(/^(https?\\:)?\\/\\/[^\\\\/]/, \"\");\n } else {\n return \"\";\n }\n };\n k.defer = function (a, b) {\n var c;\n s++;\n c = o(function () {\n delete q[c];\n f(a);\n }, b || 0);\n q[c] = true;\n return c;\n };\n k.defer.cancel = function (a) {\n if (q[a]) {\n delete q[a];\n p(a);\n f(n);\n return true;\n } else {\n return false;\n }\n };\n }\n function ib() {\n this.$get = [\"$window\", \"$log\", \"$sniffer\", \"$document\", function (a, b, c, d) {\n return new hb(a, d, b, c);\n }];\n }\n function jb() {\n this.$get = function () {\n function a(a, d) {\n function e(a) {\n if (a != m) {\n if (n) {\n if (n == a) {\n n = a.n;\n }\n } else {\n n = a;\n }\n f(a.n, a.p);\n f(a, m);\n m = a;\n m.n = null;\n }\n }\n function f(a, b) {\n if (a != b) {\n if (a) {\n a.p = b;\n }\n if (b) {\n b.n = a;\n }\n }\n }\n if (a in c) {\n throw b(\"$cacheFactory\")(\"iid\", \"CacheId '{0}' is already taken!\", a);\n }\n var g = 0;\n var h = j({}, d, {\n id: a\n });\n var i = ma();\n var k = d && d.capacity || Number.MAX_VALUE;\n var l = ma();\n var m = null;\n var n = null;\n return c[a] = {\n put(a, b) {\n if (!r(b)) {\n if (k < Number.MAX_VALUE) {\n var c = l[a] || (l[a] = {\n key: a\n });\n e(c);\n }\n if (!(a in i)) {\n g++;\n }\n i[a] = b;\n if (g > k) {\n this.remove(n.key);\n }\n return b;\n }\n },\n get(a) {\n if (k < Number.MAX_VALUE) {\n var b = l[a];\n if (!b"}]}, url: 'https://api.openai.com/v1/chat/completions' }, request: <ref *1> ClientRequest { _events: [Object: null prototype] { abort: [Function (anonymous)], aborted: [Function (anonymous)], connect: [Function (anonymous)], error: [Function (anonymous)], socket: [Function (anonymous)], timeout: [Function (anonymous)], finish: [Function: requestOnFinish] }, _eventsCount: 7, _maxListeners: undefined, outputData: [], outputSize: 0, writable: true, destroyed: false, _last: true, chunkedEncoding: false, shouldKeepAlive: false, maxRequestsOnConnectionReached: false, _defaultKeepAlive: true, useChunkedEncodingByDefault: true, sendDate: false, _removedConnection: false, _removedContLen: false, _removedTE: false, strictContentLength: false, _contentLength: 7364, _hasBody: true, _trailer: '', finished: true, _headerSent: true, _closed: false, socket: TLSSocket { _tlsOptions: [Object], _secureEstablished: true, _securePending: false, _newSessionPending: false, _controlReleased: true, secureConnecting: false, _SNICallback: null, servername: 'api.openai.com', alpnProtocol: false, authorized: true, authorizationError: null, encrypted: true, _events: [Object: null prototype], _eventsCount: 10, connecting: false, _hadError: false, _parent: null, _host: 'api.openai.com', _closeAfterHandlingError: false, _readableState: [ReadableState], _maxListeners: undefined, _writableState: [WritableState], allowHalfOpen: false, _sockname: null, _pendingData: null, _pendingEncoding: '', server: undefined, _server: null, ssl: [TLSWrap], _requestCert: true, 
_rejectUnauthorized: true, parser: null, _httpMessage: [Circular *1], [Symbol(res)]: [TLSWrap], [Symbol(verified)]: true, [Symbol(pendingSession)]: null, [Symbol(async_id_symbol)]: 154, [Symbol(kHandle)]: [TLSWrap], [Symbol(lastWriteQueueSize)]: 0, [Symbol(timeout)]: null, [Symbol(kBuffer)]: null, [Symbol(kBufferCb)]: null, [Symbol(kBufferGen)]: null, [Symbol(kCapture)]: false, [Symbol(kSetNoDelay)]: false, [Symbol(kSetKeepAlive)]: true, [Symbol(kSetKeepAliveInitialDelay)]: 60, [Symbol(kBytesRead)]: 0, [Symbol(kBytesWritten)]: 0, [Symbol(connect-options)]: [Object] }, _header: 'POST /v1/chat/completions HTTP/1.1\r\n' + 'Accept: application/json, text/plain, */*\r\n' + 'Content-Type: application/json\r\n' + 'User-Agent: OpenAI/NodeJS/3.3.0\r\n' + 'Authorization: Bearer xxx\r\n' + 'Content-Length: 7364\r\n' + 'Host: api.openai.com\r\n' + 'Connection: close\r\n' + '\r\n', _keepAliveTimeout: 0, _onPendingData: [Function: nop], agent: Agent { _events: [Object: null prototype], _eventsCount: 2, _maxListeners: undefined, defaultPort: 443, protocol: 'https:', options: [Object: null prototype], requests: [Object: null prototype] {}, sockets: [Object: null prototype], freeSockets: [Object: null prototype] {}, keepAliveMsecs: 1000, keepAlive: false, maxSockets: Infinity, maxFreeSockets: 256, scheduling: 'lifo', maxTotalSockets: Infinity, totalSocketCount: 10, maxCachedSessions: 100, _sessionCache: [Object], [Symbol(kCapture)]: false }, socketPath: undefined, method: 'POST', maxHeaderSize: undefined, insecureHTTPParser: undefined, joinDuplicateHeaders: undefined, path: '/v1/chat/completions', _ended: true, res: IncomingMessage { _readableState: [ReadableState], _events: [Object: null prototype], _eventsCount: 4, _maxListeners: undefined, socket: [TLSSocket], httpVersionMajor: 1, httpVersionMinor: 1, httpVersion: '1.1', complete: true, rawHeaders: [Array], rawTrailers: [], joinDuplicateHeaders: undefined, aborted: false, upgrade: false, url: '', method: null, statusCode: 429, statusMessage: 'Too Many Requests', client: [TLSSocket], _consuming: false, _dumped: false, req: [Circular *1], responseUrl: 'https://api.openai.com/v1/chat/completions', redirects: [], [Symbol(kCapture)]: false, [Symbol(kHeaders)]: [Object], [Symbol(kHeadersCount)]: 34, [Symbol(kTrailers)]: null, [Symbol(kTrailersCount)]: 0 }, aborted: false, timeoutCb: null, upgradeOrConnect: false, parser: null, maxHeadersCount: null, reusedSocket: false, host: 'api.openai.com', protocol: 'https:', _redirectable: Writable { _writableState: [WritableState], _events: [Object: null prototype], _eventsCount: 3, _maxListeners: undefined, _options: [Object], _ended: true, _ending: true, _redirectCount: 0, _redirects: [], _requestBodyLength: 7364, _requestBodyBuffers: [], _onNativeResponse: [Function (anonymous)], _currentRequest: [Circular *1], _currentUrl: 'https://api.openai.com/v1/chat/completions', [Symbol(kCapture)]: false }, [Symbol(kCapture)]: false, [Symbol(kBytesWritten)]: 0, [Symbol(kNeedDrain)]: false, [Symbol(corked)]: 0, [Symbol(kOutHeaders)]: [Object: null prototype] { accept: [Array], 'content-type': [Array], 'user-agent': [Array], authorization: [Array], 'content-length': [Array], host: [Array] }, [Symbol(errored)]: null, [Symbol(kHighWaterMark)]: 16384, [Symbol(kRejectNonStandardBodyWrites)]: false, [Symbol(kUniqueHeaders)]: null }, response: { status: 429, statusText: 'Too Many Requests', headers: { date: 'Mon, 06 Nov 2023 11:32:51 GMT', 'content-type': 'application/json; charset=utf-8', 'content-length': '477', connection: 'close', 
vary: 'Origin', 'x-ratelimit-limit-requests': '200', 'x-ratelimit-limit-tokens': '40000', 'x-ratelimit-remaining-requests': '196', 'x-ratelimit-remaining-tokens': '33175', 'x-ratelimit-reset-requests': '28m47.985s', 'x-ratelimit-reset-tokens': '10.236s', 'x-request-id': 'b139cbb15fd19834582fe3b584bfdca9', 'strict-transport-security': 'max-age=15724800; includeSubDomains', 'cf-cache-status': 'DYNAMIC', server: 'cloudflare', 'cf-ray': '821d04aacda10f75-EWR', 'alt-svc': 'h3=":443"; ma=86400' }, config: { transitional: [Object], adapter: [Function: httpAdapter], transformRequest: [Array], transformResponse: [Array], timeout: 0, xsrfCookieName: 'XSRF-TOKEN', xsrfHeaderName: 'X-XSRF-TOKEN', maxContentLength: -1, maxBodyLength: -1, validateStatus: [Function: validateStatus], headers: [Object], method: 'post', data:
{"model":"gpt-3.5-turbo-16k","functions":[{"name":"rename_variables_and_functions","description":"Rename variables and function names in Javascript code","parameters":{"type":"object","properties":{"variablesAndFunctionsToRename":{"type":"array","items":{"type":"object","properties":{"name":{"type":"string","description":"The name of the variable or function name to rename"},"newName":{"type":"string","description":"The new name of the variable or function name"}},"required":["name","newName"]}}},"required":["variablesToRename"]}}],"messages":[{"role":"assistant","content":"Rename all Javascript variables and functions to have descriptive names based on their usage in the code."},{"role":"user","content":" a = a.join(\" \");\n }\n if (Ud(b)) {\n b = b.join(\" \");\n }\n return a + \" \" + b;\n } else {\n return a;\n }\n } else {\n return b;\n }\n } else {\n return \"\";\n }\n }\n function eb(a) {\n for (var b = 0; b < a.length; b++) {\n var c = a[b];\n if (c.nodeType === Je) {\n return c;\n }\n }\n }\n function fb(a) {\n if (v(a)) {\n a = a.split(\" \");\n }\n var b = ma();\n d(a, function (a) {\n if (a.length) {\n b[a] = true;\n }\n });\n return b;\n }\n function gb(a) {\n if (t(a)) {\n return a;\n } else {\n return {};\n }\n }\n function hb(a, b, c, e) {\n function f(a) {\n try {\n a.apply(null, Q(arguments, 1));\n } finally {\n s--;\n if (s === 0) {\n for (; t.length;) {\n try {\n t.pop()();\n } catch (b) {\n c.error(b);\n }\n }\n }\n }\n }\n function g(a) {\n var b = a.indexOf(\"#\");\n if (b === -1) {\n return \"\";\n } else {\n return a.substr(b);\n }\n }\n function h() {\n y = null;\n i();\n j();\n }\n function i() {\n u = z();\n if (r(u)) {\n u = null;\n } else {\n u = u;\n }\n if (O(u, C)) {\n u = C;\n }\n C = u;\n }\n function j() {\n if (w !== k.url() || v !== u) {\n w = k.url();\n v = u;\n d(A, function (a) {\n a(k.url(), u);\n });\n }\n }\n var k = this;\n var l = a.location;\n var m = a.history;\n var o = a.setTimeout;\n var p = a.clearTimeout;\n var q = {};\n k.isMock = false;\n var s = 0;\n var t = [];\n k.$$completeOutstandingRequest = f;\n k.$$incOutstandingRequestCount = function () {\n s++;\n };\n k.notifyWhenNoOutstandingRequests = function (a) {\n if (s === 0) {\n a();\n } else {\n t.push(a);\n }\n };\n var u;\n var v;\n var w = l.href;\n var x = b.find(\"base\");\n var y = null;\n var z = e.history ? function () {\n try {\n return m.state;\n } catch (a) {}\n } : n;\n i();\n v = u;\n k.url = function (b, c, d) {\n if (r(d)) {\n d = null;\n }\n if (l !== a.location) {\n l = a.location;\n }\n if (m !== a.history) {\n m = a.history;\n }\n if (b) {\n var f = v === d;\n if (w === b && (!e.history || f)) {\n return k;\n }\n var h = w && Pb(w) === Pb(b);\n w = b;\n v = d;\n if (!e.history || h && f) {\n if (!h) {\n y = b;\n }\n if (c) {\n l.replace(b);\n } else if (h) {\n l.hash = g(b);\n } else {\n l.href = b;\n }\n if (l.href !== b) {\n y = b;\n }\n } else {\n m[c ? 
\"replaceState\" : \"pushState\"](d, \"\", b);\n i();\n v = u;\n }\n if (y) {\n y = b;\n }\n return k;\n }\n return y || l.href.replace(/%27/g, \"'\");\n };\n k.state = function () {\n return u;\n };\n var A = [];\n var B = false;\n var C = null;\n k.onUrlChange = function (b) {\n if (!B) {\n if (e.history) {\n Jd(a).on(\"popstate\", h);\n }\n Jd(a).on(\"hashchange\", h);\n B = true;\n }\n A.push(b);\n return b;\n };\n k.$$applicationDestroyed = function () {\n Jd(a).off(\"hashchange popstate\", h);\n };\n k.$$checkUrlChange = j;\n k.baseHref = function () {\n var a = x.attr(\"href\");\n if (a) {\n return a.replace(/^(https?\\:)?\\/\\/[^\\\\/]/, \"\");\n } else {\n return \"\";\n }\n };\n k.defer = function (a, b) {\n var c;\n s++;\n c = o(function () {\n delete q[c];\n f(a);\n }, b || 0);\n q[c] = true;\n return c;\n };\n k.defer.cancel = function (a) {\n if (q[a]) {\n delete q[a];\n p(a);\n f(n);\n return true;\n } else {\n return false;\n }\n };\n }\n function ib() {\n this.$get = [\"$window\", \"$log\", \"$sniffer\", \"$document\", function (a, b, c, d) {\n return new hb(a, d, b, c);\n }];\n }\n function jb() {\n this.$get = function () {\n function a(a, d) {\n function e(a) {\n if (a != m) {\n if (n) {\n if (n == a) {\n n = a.n;\n }\n } else {\n n = a;\n }\n f(a.n, a.p);\n f(a, m);\n m = a;\n m.n = null;\n }\n }\n function f(a, b) {\n if (a != b) {\n if (a) {\n a.p = b;\n }\n if (b) {\n b.n = a;\n }\n }\n }\n if (a in c) {\n throw b(\"$cacheFactory\")(\"iid\", \"CacheId '{0}' is already taken!\", a);\n }\n var g = 0;\n var h = j({}, d, {\n id: a\n });\n var i = ma();\n var k = d && d.capacity || Number.MAX_VALUE;\n var l = ma();\n var m = null;\n var n = null;\n return c[a] = {\n put(a, b) {\n if (!r(b)) {\n if (k < Number.MAX_VALUE) {\n var c = l[a] || (l[a] = {\n key: a\n });\n e(c);\n }\n if (!(a in i)) {\n g++;\n }\n i[a] = b;\n if (g > k) {\n this.remove(n.key);\n }\n return b;\n }\n },\n get(a) {\n if (k < Number.MAX_VALUE) {\n var b = l[a];\n if (!b"}]}, url: 'https://api.openai.com/v1/chat/completions' }, request: <ref *1> ClientRequest { _events: [Object: null prototype], _eventsCount: 7, _maxListeners: undefined, outputData: [], outputSize: 0, writable: true, destroyed: false, _last: true, chunkedEncoding: false, shouldKeepAlive: false, maxRequestsOnConnectionReached: false, _defaultKeepAlive: true, useChunkedEncodingByDefault: true, sendDate: false, _removedConnection: false, _removedContLen: false, _removedTE: false, strictContentLength: false, _contentLength: 7364, _hasBody: true, _trailer: '', finished: true, _headerSent: true, _closed: false, socket: [TLSSocket], _header: 'POST /v1/chat/completions HTTP/1.1\r\n' + 'Accept: application/json, text/plain, */*\r\n' + 'Content-Type: application/json\r\n' + 'User-Agent: OpenAI/NodeJS/3.3.0\r\n' + 'Authorization: Bearer xxx\r\n' + 'Content-Length: 7364\r\n' + 'Host: api.openai.com\r\n' + 'Connection: close\r\n' + '\r\n', _keepAliveTimeout: 0, _onPendingData: [Function: nop], agent: [Agent], socketPath: undefined, method: 'POST', maxHeaderSize: undefined, insecureHTTPParser: undefined, joinDuplicateHeaders: undefined, path: '/v1/chat/completions', _ended: true, res: [IncomingMessage], aborted: false, timeoutCb: null, upgradeOrConnect: false, parser: null, maxHeadersCount: null, reusedSocket: false, host: 'api.openai.com', protocol: 'https:', _redirectable: [Writable], [Symbol(kCapture)]: false, [Symbol(kBytesWritten)]: 0, [Symbol(kNeedDrain)]: false, [Symbol(corked)]: 0, [Symbol(kOutHeaders)]: [Object: null prototype], 
[Symbol(errored)]: null, [Symbol(kHighWaterMark)]: 16384, [Symbol(kRejectNonStandardBodyWrites)]: false, [Symbol(kUniqueHeaders)]: null }, data: { error: [Object] } }, isAxiosError: true, toJSON: [Function: toJSON] }