
S3 Upload API fails with ERR_STREAM_WRITE_AFTER_END #3404

Closed
3 tasks done
juanfrnz opened this issue Aug 20, 2020 · 14 comments
Labels
closed-for-staleness guidance Question that needs advice or information.

Comments

@juanfrnz

juanfrnz commented Aug 20, 2020


Describe the bug
Uploading large files (a few megabytes in size) with the aws-sdk S3 upload API throws an UnhandledPromiseRejectionWarning with ERR_STREAM_WRITE_AFTER_END.

Is the issue in the browser/Node.js?
Node.js

Details of the browser/Node.js version
Node.js v14.8.0

SDK version number
[email protected]

To Reproduce (observed behavior)
Upload a 2 MB file, passing the Body param as a ReadableStream.

const aws = require('aws-sdk');
const { PassThrough } = require('stream');

const s3 = new aws.S3();
const Body = new PassThrough();

const params = {
      Bucket,   // bucket name, defined elsewhere
      Key,      // object key, defined elsewhere
      Body
};
// Start the upload; the rejected promise surfaces as the warning below
s3.upload(params).promise();
// Pipe the ~2 MB source stream (defined elsewhere) into the PassThrough body
stream.pipe(Body);

Exception thrown:

(node:8) UnhandledPromiseRejectionWarning: Error [ERR_STREAM_WRITE_AFTER_END]: write after end
    at writeAfterEnd (_http_outgoing.js:643:15)
    at ClientRequest.end (_http_outgoing.js:763:7)
    at features.constructor.writeBody (/home/node/app/node_modules/aws-sdk/lib/http/node.js:137:14)
    at ClientRequest.<anonymous> (/home/node/app/node_modules/aws-sdk/lib/http/node.js:102:14)
    at ClientRequest.emit (events.js:315:20)
    at ClientRequest.EventEmitter.emit (domain.js:485:12)
    at HTTPParser.parserOnIncomingClient [as onIncoming] (_http_client.js:605:11)
    at HTTPParser.parserOnHeadersComplete (_http_common.js:117:17)
    at TLSSocket.socketOnData (_http_client.js:507:22)
    at TLSSocket.emit (events.js:315:20)
    at TLSSocket.EventEmitter.emit (domain.js:485:12)
    at addChunk (_stream_readable.js:302:12)
    at readableAddChunk (_stream_readable.js:278:9)
    at TLSSocket.Readable.push (_stream_readable.js:217:10)
    at TLSWrap.onStreamRead (internal/stream_base_commons.js:186:23)

Additional context

I think the error can be tracked down to the case where a large file upload triggers 100-Continue, which executes https://github.com/aws/aws-sdk-js/blob/master/lib/http/node.js#L100 and https://github.com/aws/aws-sdk-js/blob/master/lib/http/node.js#L137, ending the stream before the write has finished.
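
For reference, the handler at the first link looks roughly like this (a simplified excerpt; the exact lines show up as context in the patch further down this thread):

// Simplified view of lib/http/node.js around the linked lines. writeBody ends up
// calling stream.end(...) -- node.js:137 in the stack trace above.
var expect = httpRequest.headers.Expect || httpRequest.headers.expect;
if (expect === '100-continue') {
  stream.on('continue', function() {
    self.writeBody(stream, httpRequest);
  });
} else {
  this.writeBody(stream, httpRequest);
}

If the 'continue' event ever fires more than once, writeBody runs again after the request has already been ended, which matches the write-after-end above.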

@juanfrnz juanfrnz added bug This issue is a bug. needs-triage This issue or PR still needs to be triaged. labels Aug 20, 2020
@ajredniwja
Contributor

Hey @juanantoniofj, thanks for bringing this up. I was not able to reproduce this, and I used a fairly large file.

The code I used:

var aws = require('aws-sdk');
var fs = require('fs');

const s3 = new aws.S3();

const params = {
    Bucket: 'xyz',
    Key: 'abc',
    Body: fs.createReadStream('video.mov')
};

s3.upload(params, function (err, data) {
    if (err) console.log(err);
});

I believe this is more related to how the code is written. Please feel free to reach out if you have more input.

@ajredniwja ajredniwja added guidance Question that needs advice or information. response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. and removed bug This issue is a bug. needs-triage This issue or PR still needs to be triaged. labels Aug 27, 2020
@github-actions

github-actions bot commented Sep 4, 2020

This issue has not received a response in 1 week. If you still think there is a problem, please leave a comment to prevent the issue from closing automatically.

@github-actions github-actions bot added closing-soon This issue will automatically close in 4 days unless further comments are made. closed-for-staleness and removed closing-soon This issue will automatically close in 4 days unless further comments are made. labels Sep 4, 2020
@github-actions github-actions bot closed this as completed Sep 8, 2020
@gramakri
Contributor

I can consistently reproduce this with node v14.15.4 and Linode Object Storage. Curiously, I can upload to other object storages just fine (DO, S3, Minio, to name a few). I am using [email protected]. The backtrace is similar:

s3 upload error. Error [ERR_STREAM_WRITE_AFTER_END]: write after end
at writeAfterEnd (_http_outgoing.js:668:15)
at ClientRequest.end (_http_outgoing.js:788:7)
at features.constructor.writeBody (/home/yellowtent/box/node_modules/aws-sdk/lib/http/node.js:137:14)
at ClientRequest.<anonymous> (/home/yellowtent/box/node_modules/aws-sdk/lib/http/node.js:102:14)
at ClientRequest.emit (events.js:315:20)
at ClientRequest.EventEmitter.emit (domain.js:467:12)
at HTTPParser.parserOnIncomingClient [as onIncoming] (_http_client.js:606:11)
at HTTPParser.parserOnHeadersComplete (_http_common.js:126:17)
at TLSSocket.socketOnData (_http_client.js:509:22)
at TLSSocket.emit (events.js:315:20) {
code: 'NetworkingError',
region: 'us-east-1',
hostname: 'xxxx.us-east-1.linodeobjects.com',
retryable: true,
time: 2021-03-14T01:08:00.199Z
}

@gramakri
Contributor

I tried to make a minimal example, and it seems I can upload simple files just fine. In my code, it's the equivalent of "tar cvf - | s3_uploader" (roughly the sketch below). It looks like something about the source stream causes this issue. I will continue trying to make a minimal example.
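
Roughly, the pipeline is the equivalent of this sketch (bucket, key, and paths are placeholders, not my real code):

// Sketch of the "tar cvf - | s3_uploader" pattern (placeholder bucket/key/paths)
const { spawn } = require('child_process');
const { PassThrough } = require('stream');
const aws = require('aws-sdk');

const s3 = new aws.S3();
const Body = new PassThrough();

// tar writes the archive to stdout, which feeds the upload body
spawn('tar', ['cvf', '-', '/some/dir']).stdout.pipe(Body);

s3.upload({ Bucket: 'my-bucket', Key: 'backup.tar', Body: Body }, function (err) {
  if (err) console.error('s3 upload error.', err);
});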

@juanfrnz
Author

Yes, the error is related to node v14. S3 relies on HTTP 100-continue for large files, and aws-sdk is ending the pipe before it has finished. Probably not the cleanest solution, but I was able to fix this as follows:

diff --git a/lib/http/node.js b/lib/http/node.js
index be69f691..d04fa7cc 100644
--- a/lib/http/node.js
+++ b/lib/http/node.js
@@ -99,7 +99,7 @@ AWS.NodeHttpClient = AWS.util.inherit({
     var expect = httpRequest.headers.Expect || httpRequest.headers.expect;
     if (expect === '100-continue') {
       stream.on('continue', function() {
-        self.writeBody(stream, httpRequest);
+        self.writeBody(stream, httpRequest, true);
       });
     } else {
       this.writeBody(stream, httpRequest);
@@ -108,7 +108,7 @@ AWS.NodeHttpClient = AWS.util.inherit({
     return stream;
   },

-  writeBody: function writeBody(stream, httpRequest) {
+  writeBody: function writeBody(stream, httpRequest, isContinue = false) {
     var body = httpRequest.body;
     var totalBytes = parseInt(httpRequest.headers['Content-Length'], 10);

@@ -134,7 +134,8 @@ AWS.NodeHttpClient = AWS.util.inherit({
           total: totalBytes
         });
       });
-      stream.end(body);
+      if (isContinue) stream.write(body);
+      else stream.end(body);
     } else {
       // no request body
       stream.end();

@gramakri
Contributor

@juanantoniofj amazing! I can confirm your patch works.

Can someone from the AWS team re-open this bug and also comment on the patch? Thanks!

@gramakri
Contributor

#437 added 100-continue by default.

@gramakri
Contributor

I narrowed this problem down to Linode Object Storage sending 100 Continue twice. The response from Linode looks like this:

HTTP/1.1 100 Continue

HTTP/1.1 100 CONTINUE
Date: Wed, 24 Mar 2021 01:31:03 GMT
Connection: keep-alive

HTTP/1.1 200 OK
Content-Length: 0
ETag: "08db8030b2f1434199d74a4e8189c954"
Accept-Ranges: bytes
x-amz-request-id: tx00000000000000db14129-00605a9657-1f74dbc-default
Date: Wed, 24 Mar 2021 01:31:04 GMT
Connection: close

This causes the code here to pipe the body twice, which results in an error. I have also left a note at https://www.linode.com/community/questions/21180/object-storage-upload-err_stream_write_after_end-write-after-end#answer-75978 . I don't know whether the above is correct HTTP or not.
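
One client-side way to guard against the duplicated event (just a sketch of the idea, not necessarily what the eventual fix does) is to react to 'continue' only once, so the duplicate becomes a no-op:

// Sketch: register the 'continue' handler with .once so a second
// "HTTP/1.1 100 Continue" cannot trigger a second writeBody/stream.end.
if (expect === '100-continue') {
  stream.once('continue', function() {
    self.writeBody(stream, httpRequest);
  });
} else {
  this.writeBody(stream, httpRequest);
}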

gramakri added a commit to cloudron-io/aws-sdk-js that referenced this issue Mar 24, 2021
Linode Object Storage sends multiple '100 continue' lines as a response
to 'Expect: 100-continue'. This causes node.js to parse the response
correctly and fire the 'continue' event twice.

See also aws#3404
@ajredniwja ajredniwja reopened this Mar 24, 2021
@ajredniwja
Contributor

I have reopened the issue, sorry for the delay. Is this still a problem?

@gramakri
Contributor

@ajredniwja thanks for re-opening. Yes, this is still a problem and easily reproduced with Linode Object Storage. I have submitted a PR at #3674. Linode servers respond to Expect: 100-continue as shown below. I haven't been able to figure out whether the response is valid HTTP, since it sends 100 Continue twice, but it seems node.js at least parses it correctly, so it's probably valid.

HTTP/1.1 100 Continue

HTTP/1.1 100 CONTINUE
Date: Wed, 24 Mar 2021 01:31:03 GMT
Connection: keep-alive

HTTP/1.1 200 OK
Content-Length: 0
ETag: "08db8030b2f1434199d74a4e8189c954"
Accept-Ranges: bytes
x-amz-request-id: tx00000000000000db14129-00605a9657-1f74dbc-default
Date: Wed, 24 Mar 2021 01:31:04 GMT
Connection: close

@github-actions github-actions bot removed the response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. label Mar 25, 2021
@gramakri
Contributor

@ajredniwja this is still a problem, any thoughts on this?

@ajredniwja
Contributor

I haven't reproduced it with Linode Object Storage, as I am not so familiar with it. @gramakri, I will assign someone to look at the PR.

AllanZhengYP pushed a commit that referenced this issue Apr 2, 2021
Linode Object Storage sends multiple '100 continue' lines as a response
to 'Expect: 100-continue'. This causes node.js to parse the response
correctly and fire the 'continue' event twice.

See also #3404
@andresribeiro

I was having the same problem and, testing this branch from this fork, it worked perfectly.

@gramakri
Contributor

gramakri commented Apr 5, 2021

@ajredniwja #3674 is merged now, so this issue can be closed.
