
[core-http] uploading a 4000MB ArrayBuffer throws "invalid typed array length" #9481

Closed

ljian3377 opened this issue Jun 11, 2020 · 10 comments

Labels: Azure.Core, Client (This issue points to a problem in the data-plane of the library.)

@ljian3377 (Member) commented Jun 11, 2020

An ArrayBuffer can be allocated beyond the typed-array length cap, but a typed array (and therefore a Node.js Buffer) cannot. Assigning to core-http: the request is logged but does not seem to reach the server.

To reproduce:

  const FILE_UPLOAD_MAX_CHUNK_SIZE = 4000 * MB;

  it.skip("upload with chunkSize = FILE_UPLOAD_MAX_CHUNK_SIZE should succeed", async () => {
    const fileSize = FILE_UPLOAD_MAX_CHUNK_SIZE * 2 + MB;
    const arrayBuf = new ArrayBuffer(fileSize);
    try {
      await fileClient.upload(arrayBuf, {
        chunkSize: FILE_UPLOAD_MAX_CHUNK_SIZE,
        abortSignal: AbortController.timeout(20 * 1000) // takes too long to upload the file
      });
    } catch (err) {
      assert.equal(err.name, "AbortError");
    }
  }).timeout(timeoutForLargeFileUploadingTest);
@ghost added the needs-triage label on Jun 11, 2020
@ramya-rao-a added the Azure.Core and Client labels on Jun 11, 2020
@ghost removed the needs-triage label on Jun 11, 2020
@jeremymeng (Member) commented:

@ljian3377 does this repro in the Node.js test? When I tried it in a standalone Node.js app, it went down the browser branch

[screenshot: debugger showing the browser code path being taken]

and threw this error:

ReferenceError: Blob is not defined

@ljian3377 (Member, Author) commented:

Sadly we reverted that code, so it now goes down the browser branch.
You can reproduce with the following on our storage/stg73bse branch. The test account needs Jumbo Blob enabled; I pinged you a test account.
But this seems to be a bug in node-fetch 😢

  it.only("put blob with maximum size", async () => {
    recorder.skip("node", "Temp file - recorder doesn't support saving the file");
    const maxPutBlobSizeLimit = 5000 * 1024 * 1024;
    const arrBuf = new ArrayBuffer(maxPutBlobSizeLimit);

    try {
      await blockBlobClient.upload(arrBuf, maxPutBlobSizeLimit, {
        abortSignal: AbortController.timeout(20 * 1000) // takes too long to upload the file
      });
    } catch (err) {
      assert.equal(err.name, "AbortError");
    }
  }).timeout(timeoutForLargeFileUploadingTest);
"RangeError: Invalid typed array length: 5242880000
    at new Uint8Array (<anonymous>)
    at new FastBuffer (internal/buffer.js:940:1)
    at fromArrayBuffer (buffer.js:457:10)
    at Function.from (buffer.js:276:14)
    at Request.Body (S:\dev\azure-sdk-for-js\common\temp\node_modules\.pnpm\registry.npmjs.org\node-fetch\2.6.0\node_modules\node-fetch\lib\index.js:197:17)
    at new Request (S:\dev\azure-sdk-for-js\common\temp\node_modules\.pnpm\registry.npmjs.org\node-fetch\2.6.0\node_modules\node-fetch\lib\index.js:1198:8)
    at S:\dev\azure-sdk-for-js\common\temp\node_modules\.pnpm\registry.npmjs.org\node-fetch\2.6.0\node_modules\node-fetch\lib\index.js:1403:19
    at new Promise (<anonymous>)
    at fetch (S:\dev\azure-sdk-for-js\common\temp\node_modules\.pnpm\registry.npmjs.org\node-fetch\2.6.0\node_modules\node-fetch\lib\index.js:1401:9)
    at NodeFetchHttpClient.<anonymous> (S:\dev\azure-sdk-for-js\sdk\core\core-http\src\nodeFetchHttpClient.ts:91:12)
    at step (S:\dev\azure-sdk-for-js\common\temp\node_modules\.pnpm\registry.npmjs.org\tslib\2.0.0\node_modules\tslib\tslib.js:140:27)
    at Object.next (S:\dev\azure-sdk-for-js\common\temp\node_modules\.pnpm\registry.npmjs.org\tslib\2.0.0\node_modules\tslib\tslib.js:121:57)
    at S:\dev\azure-sdk-for-js\common\temp\node_modules\.pnpm\registry.npmjs.org\tslib\2.0.0\node_modules\tslib\tslib.js:114:75
    at new Promise (<anonymous>)
    at Object.__awaiter (S:\dev\azure-sdk-for-js\common\temp\node_modules\.pnpm\registry.npmjs.org\tslib\2.0.0\node_modules\tslib\tslib.js:110:16)
    at NodeFetchHttpClient.fetch (S:\dev\azure-sdk-for-js\sdk\core\core-http\dist\coreHttp.node.js:2551:22)
    at NodeFetchHttpClient.<anonymous> (S:\dev\azure-sdk-for-js\sdk\core\core-http\src\fetchHttpClient.ts:136:45)
    at step (S:\dev\azure-sdk-for-js\common\temp\node_modules\.pnpm\registry.npmjs.org\tslib\2.0.0\node_modules\tslib\tslib.js:140:27)
    at Object.next (S:\dev\azure-sdk-for-js\common\temp\node_modules\.pnpm\registry.npmjs.org\tslib\2.0.0\node_modules\tslib\tslib.js:121:57)
    at fulfilled (S:\dev\azure-sdk-for-js\common\temp\node_modules\.pnpm\registry.npmjs.org\tslib\2.0.0\node_modules\tslib\tslib.js:111:62)
    at processTicksAndRejections (internal/process/task_queues.js:93:5)

@jeremymeng (Member) commented:

This seems to be a limitation in Node.js:

> const arrBuff = new ArrayBuffer(5000*1024*1024)
undefined
> Buffer.from(arrBuff)
Uncaught RangeError: Invalid typed array length: 5242880000
    at new Uint8Array (<anonymous>)
    at new FastBuffer (internal/buffer.js:944:1)
    at fromArrayBuffer (buffer.js:477:10)
    at Function.from (buffer.js:292:14)
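
For reference, the cap is exposed as buffer.constants.MAX_LENGTH; on the 64-bit Node.js builds in use here it is 2^31 - 1 bytes (later versions raise it), well below the ~4.2 GB and ~5.2 GB sizes above:

> require("buffer").constants.MAX_LENGTH
2147483647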

@jeremymeng (Member) commented:

So node-fetch shouldn't try to convert the ArrayBuffer into a single Buffer.
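
node-fetch 2.x also accepts a Node.js Readable stream as the request body and pipes it rather than copying it into one Buffer, so a stream body should sidestep the cap entirely. A minimal sketch (the URL is a placeholder):

  const fetch = require("node-fetch");
  const { Readable } = require("stream");

  // Each chunk is an ordinary Buffer well under the typed-array cap;
  // node-fetch pipes the stream instead of calling Buffer.from on one
  // giant ArrayBuffer.
  const body = Readable.from([Buffer.alloc(64 * 1024 * 1024)]);
  fetch("https://example.invalid/upload", { method: "PUT", body }); // handle the returned promise in real code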

@jeremymeng (Member) commented:

We might be able to work around it internally by converting a large ArrayBuffer into a stream:

    const bufferSize = 1000 * MB;
    let offset = 0;
    const inputStream = new PassThrough();

    // Push the ArrayBuffer through the stream in sub-cap chunks.
    // Buffer.from(arrayBuffer, offset, length) creates a zero-copy view,
    // so each chunk stays under the typed-array limit without duplicating memory.
    while (offset + bufferSize < arrBuf.byteLength) {
      inputStream.push(Buffer.from(arrBuf, offset, bufferSize));
      offset += bufferSize;
    }
    inputStream.push(Buffer.from(arrBuf, offset, arrBuf.byteLength - offset));
    inputStream.push(null); // signal end of stream

    try {
      await blockBlobClient.upload(() => inputStream, maxPutBlobSizeLimitInMB * MB, {
        // abortSignal: AbortController.timeout(20 * 1000) // takes too long to upload the file
        onProgress: (state) => {
          console.log(state.loadedBytes);
        }
      });
    } catch (err) {
      console.log(err);
    }
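
A reusable version of the same idea (a hypothetical helper, not part of the SDK) could wrap the chunking loop:

    const { PassThrough } = require("stream");

    // Hypothetical helper: expose a large ArrayBuffer as a Readable stream
    // of sub-cap, zero-copy Buffer views.
    function arrayBufferToStream(arrayBuffer, chunkSize = 1000 * 1024 * 1024) {
      const stream = new PassThrough();
      let offset = 0;
      while (offset < arrayBuffer.byteLength) {
        const length = Math.min(chunkSize, arrayBuffer.byteLength - offset);
        stream.push(Buffer.from(arrayBuffer, offset, length));
        offset += length;
      }
      stream.push(null); // signal end of stream
      return stream;
    }

The upload call above would then become blockBlobClient.upload(() => arrayBufferToStream(arrBuf), arrBuf.byteLength, ...).
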
@ljian3377 (Member, Author) commented:

Are you suggesting we do this in our BlockBlobClient.upload(Buffer) function, or is it just a workaround for customers?

@jeremymeng (Member) commented:

I was thinking about doing this in upload() when the ArrayBuffer is too large, and opening an issue in node-fetch. Doing it in core-http is another option, but so far no other libraries have this need.

@ramya-rao-a (Contributor) commented:

@ljian3377, can you confirm that this issue is due to node-fetch? If so, moving to the new Azure Core v2 should fix it.

@jeremymeng (Member) commented:

Yes, it's caused by node-fetch issue node-fetch/node-fetch#893.

@ramya-rao-a (Contributor) commented:

@ljian3377 Given that this is a limitation in node-fetch, there is not much we can do here.
When you move to the new Azure Core v2, this should get resolved. If not, please feel free to re-open the issue then.

@xirzec removed this from the Backlog milestone on May 18, 2022
@github-actions locked and limited conversation to collaborators on Apr 12, 2023