
AxiosError: maxContentLength size of -1 exceeded #4806

Closed
MarianoFacundoArch opened this issue Jun 23, 2022 · 34 comments
Labels
state::triage Issue that is awaiting trial

Comments

@MarianoFacundoArch

Doing an axios request with

const a = await axios.get(URL)

and without any further configuration, it fails with:
'AxiosError: maxContentLength size of -1 exceeded'

The content length of the website is significant (>185 KB), but since I did not configure any maxContentLength, this behaviour makes no sense.

What's the fix? I tried configuring a huge maxContentLength, but it still shows the error.
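A minimal sketch of explicitly lifting both limits (maxContentLength and maxBodyLength are real axios config options; Infinity disables the size checks) — though, as this thread shows, the error often turns out to be unrelated to the actual limit:

```javascript
// Sketch: lift both size limits explicitly. -1 is axios's internal
// "unset" default; Infinity is the value that disables the check.
const axiosConfig = {
  maxContentLength: Infinity, // response body size limit
  maxBodyLength: Infinity,    // request body size limit
};

// Usage (assumes axios is installed):
// const a = await axios.get(URL, axiosConfig);
```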

@arossert

arossert commented Jul 5, 2022

I have a similar issue when using proxy-agent and the proxy returns 407.
I can see the real error hiding in the response: "Proxy Authentication Required"

{
  code: 'ERR_BAD_RESPONSE',
  config: {
    transitional: {
      silentJSONParsing: true,
      forcedJSONParsing: true,
      clarifyTimeoutError: false
    },
    adapter: [Function: httpAdapter],
    transformRequest: [ [Function: transformRequest] ],
    transformResponse: [ [Function: transformResponse] ],
    timeout: 0,
    xsrfCookieName: 'XSRF-TOKEN',
    xsrfHeaderName: 'X-XSRF-TOKEN',
    maxContentLength: -1,
    maxBodyLength: -1,
    env: { FormData: [Function] },
    validateStatus: [Function: validateStatus],
    headers: {
      Accept: 'application/json, text/plain, */*',
      'User-Agent': 'axios/0.27.2'
    },
    httpsAgent: ProxyAgent {
      proxy: [Url],
      proxyUri: 'http://user:[email protected]:3228',
      proxyFn: [Function: httpOrHttpsProxy],
      promisifiedCallback: [Function (anonymous)]
    },
    httpAgent: ProxyAgent {
      proxy: [Url],
      proxyUri: 'http://user:[email protected]:3228',
      proxyFn: [Function: httpOrHttpsProxy],
      promisifiedCallback: [Function (anonymous)]
    },
    method: 'get',
    url: 'https://www.google.com',
    data: undefined
  },
  request: <ref *1> ClientRequest {
    _events: [Object: null prototype] {
      abort: [Function (anonymous)],
      aborted: [Function (anonymous)],
      connect: [Function (anonymous)],
      error: [Function (anonymous)],
      socket: [Function (anonymous)],
      timeout: [Function (anonymous)],
      prefinish: [Function: requestOnPrefinish]
    },
    _eventsCount: 7,
    _maxListeners: undefined,
    outputData: [],
    outputSize: 128,
    writable: true,
    destroyed: true,
    _last: true,
    chunkedEncoding: false,
    shouldKeepAlive: false,
    _defaultKeepAlive: true,
    useChunkedEncodingByDefault: false,
    sendDate: false,
    _removedConnection: false,
    _removedContLen: false,
    _removedTE: false,
    _contentLength: 0,
    _hasBody: true,
    _trailer: '',
    finished: true,
    _headerSent: true,
    socket: Socket {
      connecting: false,
      _hadError: false,
      _parent: null,
      _host: null,
      _readableState: [ReadableState],
      _events: [Object: null prototype],
      _eventsCount: 8,
      _maxListeners: undefined,
      _writableState: [WritableState],
      allowHalfOpen: false,
      _sockname: null,
      _pendingData: null,
      _pendingEncoding: '',
      server: null,
      _server: null,
      parser: null,
      _httpMessage: [Circular *1],
      write: [Function: writeAfterFIN],
      [Symbol(async_id_symbol)]: -1,
      [Symbol(kHandle)]: null,
      [Symbol(kSetNoDelay)]: false,
      [Symbol(lastWriteQueueSize)]: 0,
      [Symbol(timeout)]: null,
      [Symbol(kBuffer)]: null,
      [Symbol(kBufferCb)]: null,
      [Symbol(kBufferGen)]: null,
      [Symbol(kCapture)]: false,
      [Symbol(kBytesRead)]: 0,
      [Symbol(kBytesWritten)]: 0,
      [Symbol(RequestTimeout)]: undefined
    },
    _header: 'GET / HTTP/1.1\r\n' +
      'Accept: application/json, text/plain, */*\r\n' +
      'User-Agent: axios/0.27.2\r\n' +
      'Host: www.google.com\r\n' +
      'Connection: close\r\n' +
      '\r\n',
    _keepAliveTimeout: 0,
    _onPendingData: [Function: noopPendingOutput],
    agent: ProxyAgent {
      proxy: [Url],
      proxyUri: 'http://user:[email protected]:3228',
      proxyFn: [Function: httpOrHttpsProxy],
      promisifiedCallback: [Function (anonymous)]
    },
    socketPath: undefined,
    method: 'GET',
    maxHeaderSize: undefined,
    insecureHTTPParser: undefined,
    path: '/',
    _ended: false,
    res: IncomingMessage {
      _readableState: [ReadableState],
      _events: [Object: null prototype],
      _eventsCount: 4,
      _maxListeners: undefined,
      socket: [Socket],
      httpVersionMajor: 1,
      httpVersionMinor: 1,
      httpVersion: '1.1',
      complete: false,
      headers: [Object],
      rawHeaders: [Array],
      trailers: {},
      rawTrailers: [],
      aborted: true,
      upgrade: false,
      url: '',
      method: null,
      statusCode: 407,
      statusMessage: 'Proxy Authentication Required',
      client: [Socket],
      _consuming: true,
      _dumped: false,
      req: [Circular *1],
      responseUrl: 'https://www.google.com/',
      redirects: [],
      [Symbol(kCapture)]: false,
      [Symbol(RequestTimeout)]: undefined
    },
    aborted: false,
    timeoutCb: null,
    upgradeOrConnect: false,
    parser: null,
    maxHeadersCount: null,
    reusedSocket: false,
    host: 'www.google.com',
    protocol: 'https:',
    _redirectable: Writable {
      _writableState: [WritableState],
      _events: [Object: null prototype],
      _eventsCount: 3,
      _maxListeners: undefined,
      _options: [Object],
      _ended: true,
      _ending: true,
      _redirectCount: 0,
      _redirects: [],
      _requestBodyLength: 0,
      _requestBodyBuffers: [],
      _onNativeResponse: [Function (anonymous)],
      _currentRequest: [Circular *1],
      _currentUrl: 'https://www.google.com/',
      [Symbol(kCapture)]: false
    },
    [Symbol(kCapture)]: false,
    [Symbol(kNeedDrain)]: false,
    [Symbol(corked)]: 0,
    [Symbol(kOutHeaders)]: [Object: null prototype] {
      accept: [Array],
      'user-agent': [Array],
      host: [Array]
    }
  }
}

@colesiegel

I am getting the same error when I try to post files using multipart/form-data #4885

@chenggaw

chenggaw commented Aug 9, 2022

any update?

@zhujun24

Any update? Is something triggering a response abort somewhere? It is unlikely to be a server-side issue.

@pkaminski

AFAICT the error is raised here, in response to an aborted event on the response stream. The message is nonsense when maxContentLength is set to -1 so I guess you should just ignore it and treat the error as a generic "request aborted" failure.
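One way to act on that observation — a hypothetical helper (isSpuriousMaxContentLengthError is not an axios API) that matches the misleading message when no real limit was configured:

```javascript
// Hypothetical helper (not an axios API): detect the misleading
// "maxContentLength size of -1/Infinity exceeded" message, which axios
// emits from its aborted-stream handler even when no real limit is set.
function isSpuriousMaxContentLengthError(err) {
  const message = (err && err.message) || '';
  // -1 means "unset" and Infinity means "unlimited", so the size claim
  // in the message is meaningless; treat it as a generic aborted request.
  return /maxContentLength size of (-1|Infinity) exceeded/.test(message);
}

// Usage sketch:
// try {
//   await axios.get(url);
// } catch (err) {
//   if (isSpuriousMaxContentLengthError(err)) {
//     // handle as "request aborted", not as a size-limit violation
//   }
// }
```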

@colesiegel

It actually obscures the real errors (e.g. the 407 mentioned in the post by @arossert).

Other errors are also incorrectly handled and hidden by axios; see #4888.

This makes it hard to know what your API should do, since we cannot properly differentiate the cause of errors and the correct status code is not returned.

@gujixingxing

I also have the same problem.

@bsonmez

bsonmez commented Sep 20, 2022

same here

@calinbogdan

Any updates on this?

@UtpalKumarJha

I am facing the same problem too.

@tigerinus

any update?

@jalethesh

I get the same issue.


@jalethesh

jalethesh commented Nov 15, 2022 via email

@colesiegel

Is this fixed now? I am confused by the update above ^

@js2me

js2me commented Nov 16, 2022


@jalethesh @colesiegel

Sorry, I thought this was about swagger-typescript-api and created an autoreply for this message, ahah.
swagger-typescript-api had the same issue; in the end I just removed axios and replaced it with node-fetch.

@pincombe

pincombe commented Dec 7, 2022

I'm having problems with this too. I'll do some investigating myself to see if I can determine the problem.

@louislam

louislam commented Dec 8, 2022

I recently fixed the issue. In our case it was usually related to WordPress with a cache plugin. If you try curl https://your-domain, the response is usually broken too.

The fix is adding this header in your request:

"Accept-Encoding": "gzip, deflate",

You can read this for more info:
louislam/uptime-kuma#2253

Commit:
louislam/uptime-kuma@3e68cf2
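The workaround above, as a request-config sketch (assumes axios is installed; the URL is a placeholder):

```javascript
// Sketch of the workaround suggested above: explicitly advertise only
// gzip/deflate so a misbehaving server or cache plugin does not respond
// with an encoding the client then fails to consume.
const requestOptions = {
  headers: {
    'Accept-Encoding': 'gzip, deflate',
  },
};

// Usage (placeholder URL):
// const res = await axios.get('https://your-domain', requestOptions);
```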

@anselmo-coolstay

This issue occurs for me with v1.2.2.
My server sends tens of thousands of similar POST requests every day, but I have hit this issue only once.
The POST request uses JSON data with just 4 properties.

@1c7

1c7 commented Feb 4, 2023

no update?

@mori5321

mori5321 commented Mar 10, 2023

I created a PR; it's now in review.
#4917

@Johnrobmiller

Johnrobmiller commented May 5, 2023

Me, when my 2MB image can't be fetched because the max content size is 4MB


@pistomat

Any update on this? Can we at least merge #4917 please?

@MarkoOndrejicka

The fix is adding this header in your request:

"Accept-Encoding": "gzip, deflate",

I looked closer at my request using Postman and noticed that the response headers include Content-Encoding: gzip. I don't understand why it started causing issues out of nowhere, but your suggestion of adding the "Accept-Encoding": "gzip" request header helped me resolve this issue. Thank you @louislam!

@zhandosm

I recently fixed the issue. In our cases, it was usually related to WordPress with a cache plugin. If you try curl https://your-domain, the response is broken too usually.

The fix is adding this header in your request:

"Accept-Encoding": "gzip, deflate",

You can read this for more info: louislam/uptime-kuma#2253

Commit: louislam/uptime-kuma@3e68cf2

Adding "Accept-Encoding": "gzip, deflate" to my headers fixed the issue for me. Thank you @louislam.

I made the request from my backend to another server and increased the content size limit up to 10 MB, but still got the same error when my response was around 2 MB.

I didn't dig too deep into why it happens; I only skimmed the "Check of running Websites results in error "maxContentLength size of -1 exceeded"" issue referred to by @louislam.

@colesiegel

The best fix I found is to use got instead, since this library clearly isn't maintained.

mszag added a commit to mszag/homebridge-tasmota-control that referenced this issue Oct 18, 2023
it should fix the issue with the axios error 'AxiosError: maxContentLength size of -1 exceeded'. There is an open issue related to it in the axios GitHub: axios/axios#4806. In hb it causes the following errors: [10/18/2023, 5:11:17 PM] [homebridge-tasmota-control] Device: 192.168.0.220 DGM Plug1, check state error: AxiosError: timeout of 10000ms exceeded, trying again. I'm not sure if a value of 2000 will be enough
@johenning

This cost me several hours, but I finally narrowed down the problem for my case: the issue was the transfer encoding. If it is set to chunked, axios throws the above error (with chunked encoding there is no content-length, because the size is dynamic). The underlying reason looks to be a closed socket in the Node http adapter.

Parts of this might be due to my specific setup. I have Hoverfly as a MITM proxy between Node and the real system. The real upstream system returns responses without chunked encoding, but Hoverfly replays the request with chunked encoding.

So the request works live, but not when replayed. It worked fine for a long time; the issue only appeared when upgrading from [email protected] to [email protected].

Now, I'm just speculating, but when comparing the original response and the replayed one I noticed a few things. Maybe the reason for the closed socket is a bug in Node itself, because the replay works fine with curl & Postman.
In my specific example the data is interleaved with the connection being closed; I don't know enough about the standard to judge whether it's correct behavior, but I captured this with curl -vvv:

* TLSv1.2 (IN), TLS header, Supplemental data (23):
<Lots of JSON payload here...>* TLSv1.2 (IN), TLS header, Supplemental data (23):
* TLSv1.2 (IN), TLS header, Supplemental data (23):
* TLSv1.2 (IN), TLS header, Supplemental data (23):
* Closing connection 0
* TLSv1.2 (OUT), TLS header, Supplemental data (23):
* TLSv1.3 (OUT), TLS alert, close notify (256):<...Continued JSON payload>

So there is still some data arriving after the TLS connection has been closed. This is not the case without chunked encoding.
It could be completely unrelated, but maybe this saves someone else some time.

@michalss

michalss commented Dec 31, 2023

AxiosError: maxContentLength size of Infinity exceeded

this is still happening to me for no reason... I set Infinity and no matter what, it always crashes. This must be some nasty bug in axios.

Crashing here: AxiosError: maxContentLength size of Infinity exceeded
at IncomingMessage.handlerStreamAborted (D:\Private\Projects\NodeJS\MongoDB-Test\node_modules\axios\dist\node\axios.cjs:3034:23)

version "axios": "^1.6.3",

@Bigzo

Bigzo commented Mar 7, 2024

I faced the same problem; adding the "Connection": "keep-alive" request header resolved it.
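A minimal sketch of that suggestion (header name and value as given above; whether it helps will depend on your server or proxy):

```javascript
// Hypothetical sketch: force keep-alive on the request, per the
// suggestion above. Not guaranteed to help in every setup.
const keepAliveOptions = {
  headers: { Connection: 'keep-alive' },
};

// Usage (assumes axios is installed):
// const res = await axios.get(url, keepAliveOptions);
```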

@wenshan

wenshan commented Mar 10, 2024

I get the same issue.

@johenning

I've had similar problems with the node@20 update, and in the process found a correlation with the keep-alive setting, which meant I could fix it for my use case.
I drew inspiration from these issues:

and I continue to believe that a Node http adapter issue is the underlying cause here

@satnam-sandhu

I got the same issue

@sharifzadesina

I also hit the same issue randomly. It is not really related to content length; it is probably fired when the read stream is aborted for some reason.

Anyway, I also just switched to ofetch.

@jvassev

jvassev commented Jul 11, 2024

I also get this, but in my case the problem was due to memory pressure. To fix it I used responseType: 'stream' and piped everything.
I used it to download multiple files in parallel, each about 50 MB, in a container with limited memory. It always worked on my workstation with 64 GB :)

Jalle19 added a commit to Jalle19/eachwatt that referenced this issue Aug 8, 2024
Fixes stupid maxContentLength error on aborted requests (axios/axios#4806)
@jasonsaayman jasonsaayman added the state::triage Issue that is awaiting trial label Sep 26, 2024