
S3.send sometimes leads to application exit with code 0 #4332

Closed
bes opened this issue Jan 9, 2023 · 46 comments

Labels: bug This issue is a bug. · closed-for-staleness · p2 This is a standard priority issue

Comments

@bes commented Jan 9, 2023

Describe the bug

Executing this code...

        let s3: S3 = /*...*/;

        // Many awaits

        await s3.send(
            new PutObjectCommand({
                Bucket: bucket,
                Key: toFile,
                Body: data,
                CacheControl: cacheControl,
                ContentType: contentType ?? undefined,
            }),
        );

        // More awaits

...in the middle of a long chain of awaits sometimes leads to application exit with code 0.

I believe this may be because of the behavior described here: nodejs/node#22088
Basically, I think s3.send somehow drops its handles to both resolve and reject, so the Node runtime's count of pending work drops to 0, which in turn makes Node exit automatically with code 0.
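To illustrate what I mean, here is a minimal sketch (my assumption about the mechanism, not SDK code): if the only pending work is an await on a promise that can never settle, Node simply drains the event loop and exits with code 0.

// Minimal sketch of the suspected mechanism (assumption, not SDK code).
// The promise below can never settle, and nothing else keeps the event loop alive,
// so Node drains the loop and exits with code 0 before "after await" is printed.
const neverSettles = new Promise<void>(() => {
    // resolve/reject are dropped here; no timer or socket is registered
});

const main = async (): Promise<void> => {
    console.log("before await");
    await neverSettles; // the event loop empties here -> process exits with code 0
    console.log("after await"); // never reached
};

main();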

This took a long time to figure out and is very confusing when encountering it for the first time.

The reason I think this is what happens is that when I use the setTimeout trick, the application stays alive until setTimeout triggers and then immediately exits with code 0.

Example of this:

        const t = setTimeout(() => console.log("Timeout"), 5000);
        await s3.send(
            new PutObjectCommand({
                Bucket: bucket,
                Key: toFile,
                Body: data,
                CacheControl: cacheControl,
                ContentType: contentType ?? undefined,
            }),
        );
        clearTimeout(t);

It seems to happen more often on 4G connections than on WiFi.

Expected Behavior

An actual error to be thrown or printed.

Current Behavior

My node application exits with code 0 even though not all of the code has run yet.

Reproduction Steps

Since the error is intermittent, and I have a lot of proprietary code, I can't give a better example than this:

        let s3: S3 = /*...*/;

        // Many awaits

        await s3.send(
            new PutObjectCommand({
                Bucket: bucket,
                Key: toFile,
                Body: data,
                CacheControl: cacheControl,
                ContentType: contentType ?? undefined,
            }),
        );

        // More awaits

Possible Solution

Hopefully the code in the aws-sdk-js repository can be fixed.

Additional Information/Context

No response

SDK version used

3.245.0

Environment details (OS name and version, etc.)

macOS 13.0.1 (22A400)
node v18.8.0
scripts compiled & run using ts-node

@bes bes added bug This issue is a bug. needs-triage This issue or PR still needs to be triaged. labels Jan 9, 2023
@ajredniwja (Contributor)

Transferring to V3 repo of SDK

@ajredniwja ajredniwja transferred this issue from aws/aws-sdk-js Jan 9, 2023
@RanVaknin RanVaknin self-assigned this Jan 9, 2023
@RanVaknin (Contributor) commented Jan 9, 2023

Hi @bes,

The reason I think this is what happens is that when I use the setTimeout trick, the application stays alive until setTimeout triggers and then immediately exits with code 0.

I'm not sure what the "trick" is?

Since there is no complete repro code I can't speak to why you are experiencing this, but it seems like you are not doing any sort of error handling. If the app crashes, it's probably due to an error coming from the server that you are not handling.
Consider wrapping your SDK command with a try/catch block:

try {
    await s3.send(
        new PutObjectCommand({
            Bucket: bucket,
            Key: toFile,
            Body: data,
            CacheControl: cacheControl,
            ContentType: contentType ?? undefined,
        }),
    );
} catch (error) {
    console.log("there is an error! ", error);
}

If after you've implemented error handling your app still crashes unexpectedly, please include a reproducible code snippet we can test on our end to see what can be done to help.

Thank you!
Ran~

@RanVaknin RanVaknin added response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. and removed needs-triage This issue or PR still needs to be triaged. labels Jan 9, 2023
@bes (Author) commented Jan 9, 2023

Hi @RanVaknin

Thank you for your response.

I am using error handling.

A more complete example:

const pushToS3 = async (semaphore: Semaphore, s3: S3, bucket: string, data: Buffer, toFile: string): Promise<void> => {
    const release = await semaphore.acquire();
    try {
        // ... stuff
        const t = setTimeout(() => console.log("This is printed just before exit code 0"), 5000);
        const x = await s3.send(
            new PutObjectCommand({
                Bucket: bucket,
                Key: toFile,
                Body: data,
                CacheControl: cacheControl,
                ContentType: contentType ?? undefined,
            }),
        );
        clearTimeout(t);
        console.log("Intermittently never reached!");
    } catch (e) {
        console.log(e);
    } finally {
        release();
    }
};

There is no error log.
There is no log of "Intermittently never reached" when it happens.
Only "This is printed just before exit code 0".
And then exit code 0.

@bes (Author) commented Jan 9, 2023

If I leave out setTimeout I get exit code 0 shortly after doing s3.send. If I keep it in, I get it after exactly 5 seconds.

@bes (Author) commented Jan 9, 2023

We are on an AWS support plan, is there a way of sharing our code through support?

@bes (Author) commented Jan 9, 2023

Could it be a race condition in S3? I am issuing multiple pushToS3 at the same time. Does a single S3 instance support multiple simultaneous uploads?

@github-actions github-actions bot removed the response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. label Jan 10, 2023
@RanVaknin (Contributor) commented Jan 10, 2023

Hi @bes ,

If I leave out setTimeout I get exit code 0 shortly after doing s3.send. If I keep it in, I get it after exactly 5 seconds.

Are you just using setTimeout here to isolate the issue?

We are on an AWS support plan, is there a way of sharing our code through support?

You can, but I think you shouldn't. Those internal support tickets end up in our queue, and you will have to talk to us through a technical account manager. I recommend talking to us directly here.

Could it be a race condition in S3? I am issuing multiple pushToS3 at the same time. Does a single S3 instance support multiple simultaneous uploads?

Unless you're doing thousands of writes per second I don't think the service will throttle, and even if it does, it will throw a 5xx error that should be retried by the retryer and also logged to the console.
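As a side note, if you want retries to be visible, a rough sketch (the values are arbitrary; maxAttempts and logger are standard client configuration options):

import { S3 } from "@aws-sdk/client-s3";

// Rough sketch: cap retries and log each attempt so throttling/5xx responses
// would show up in the console instead of failing silently.
const s3 = new S3({
    maxAttempts: 5,
    logger: console,
});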

The line I suspect is causing issues for you is the following one:
const release = await semaphore.acquire();
I'm not too familiar with this library, but from the brief reading I've done, it seems to have some functionality around threading and concurrency, and my guess is that it's causing this behavior. Try removing that line and test without it.

As an experiment I've refactored your code and ran this successfully:

// file named sample.js

import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3'

const client = new S3Client({})

const pushToS3 = async (s3client, bucket, data, key, cacheControl, contentType) => {
    try {
        await s3client.send(
            new PutObjectCommand({
                Bucket: bucket,
                Key: key,
                Body: data,
                CacheControl: cacheControl,
                ContentType: contentType ?? undefined,
            }),
        );
        console.log("Intermittently never reached!");
    } catch (e) {
        console.log(e);
    }
};

for (let i = 0; i < 20; i++) {
    pushToS3(
        client,
        "testbucket-3650",
        `hello-world${Date.now()}`, //unique body
        `key${i}.txt`,
        "max-age=99",
        "text/html; charset=utf-8"
    )
}

Output:

$ node sample.js
Intermittently never reached!
Intermittently never reached!
Intermittently never reached!
Intermittently never reached!
Intermittently never reached!
Intermittently never reached!
Intermittently never reached!
Intermittently never reached!
Intermittently never reached!
Intermittently never reached!
Intermittently never reached!
Intermittently never reached!
Intermittently never reached!
Intermittently never reached!
Intermittently never reached!
Intermittently never reached!
Intermittently never reached!
Intermittently never reached!
Intermittently never reached!
Intermittently never reached!

Let me know if this helps.
Thanks,
Ran~

@RanVaknin RanVaknin added the response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. label Jan 10, 2023
@bes (Author) commented Jan 10, 2023

Hi @RanVaknin thank you for your suggestion. I have removed all uses of Semaphore from my code, but it still happens. I'm happy to help if you have more suggestions for how to dig deeper.

EDIT: Please note that it doesn't happen all the time. It's an intermittent issue, happening in roughly 1 in 5 to 1 in 20 runs.

@bes (Author) commented Jan 10, 2023

This is a typical error flow:

Intermittently never reached!
Intermittently never reached!
Timeout

With exit code 0

@github-actions github-actions bot removed the response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. label Jan 11, 2023
@RanVaknin (Contributor)

Hi @bes,

I spent quite a bit of time reading through the initial issue you linked from Node's repo. Additionally, I looked at how we handle http requests and I have not found a case in which we drop reference to the promise. From an error, to socket timeout, we handle all cases and raise the appropriate errors.

I really think this is a Node issue based on how much traction that issue got on that thread.

Another thing you can do is add an additional await after your s3.putObject call, to see if that one is called.
Perhaps there is an issue with the putObject call being the last await in your chain?
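For example, a hedged diagnostic sketch (it reuses the s3, bucket, toFile, and data variables from your earlier snippet):

// Diagnostic sketch: add a cheap trailing await after the PutObject call to see
// whether execution continues past send(), or whether the process exits before
// either log line prints.
await s3.send(new PutObjectCommand({ Bucket: bucket, Key: toFile, Body: data }));
console.log("putObject resolved");

await new Promise((resolve) => setTimeout(resolve, 100)); // any follow-up await works
console.log("trailing await resolved");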

Since the error is intermittent, and I have a lot of proprietary code, I can't give a better example than this:

Unfortunately I'm not able to reproduce your issue with what we have so far. If you can provide a more comprehensive code sample that raises that error, even intermittently, I can try to dig deeper into the issue at hand.

Thank you!
Ran~

@RanVaknin RanVaknin added the response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. label Jan 11, 2023
@bes (Author) commented Jan 11, 2023

Thank you for your effort!

May I ask what version of node you used when you tested? I want to try again with that same version on my code.

@RanVaknin (Contributor)

Hi @bes,

My pleasure,

I use Node 18.0.0.

Thanks,
Ran~

@bes (Author) commented Jan 11, 2023

@RanVaknin Alright, here is a minimal proof of concept that gives exit code 0 within a few seconds: https://github.com/bes/aws-sdk-js-v3-s3-exit-0

How I ran it:

AWS Credentials
aws sso login --profile default
yawsso --default-only

Node / Yarn
node --version
v18.8.0
yarn --version
1.22.19

Install command:
yarn install

Run command:
yarn run poc

Output:

yarn run v1.22.19
$ TS_NODE_PROJECT="tsconfig.json" ts-node build.ts
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
✨  Done in 4.50s.

@github-actions github-actions bot removed the response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. label Jan 12, 2023
@RanVaknin (Contributor) commented Jan 12, 2023

Hi @bes,

Thanks for providing me with the repo.

I refactored your code a bit and let the program run for about 5-6 minutes before stopping it.
It managed to write a few thousand objects to s3 successfully without exiting:

(screenshot omitted)

I usually use unique object bodies when testing because otherwise the ETag would be the same, and it would be harder to debug. So something like:

Body: `hello ${Date.now()}`

The only time my application crashed was when I ran the request with an empty buffer like in your example:

Body: Buffer.from("", "utf-8")

The error I got was:

node[42307]: ../src/node_http_parser.cc:589:static void node::(anonymous namespace)::Parser::Execute(const FunctionCallbackInfo<v8::Value> &): Assertion `parser->current_buffer_.IsEmpty()' failed.
 1: 0x1025c2e28 node::Abort() [/Users/rvaknin/.nvm/versions/node/v18.0.0/bin/node]
 2: 0x1025c2c68 node::AppendExceptionLine(node::Environment*, v8::Local<v8::Value>, v8::Local<v8::Message>, node::ErrorHandlingMode) [/Users/rvaknin/.nvm/versions/node/v18.0.0/bin/node]
 3: 0x1025dc004 node::(anonymous namespace)::Parser::Execute(v8::FunctionCallbackInfo<v8::Value> const&) [/Users/rvaknin/.nvm/versions/node/v18.0.0/bin/node]
 4: 0x102785274 v8::internal::FunctionCallbackArguments::Call(v8::internal::CallHandlerInfo) [/Users/rvaknin/.nvm/versions/node/v18.0.0/bin/node]
 5: 0x102784d58 v8::internal::MaybeHandle<v8::internal::Object> v8::internal::(anonymous namespace)::HandleApiCallHelper<false>(v8::internal::Isolate*, v8::internal::Handle<v8::internal::HeapObject>, v8::internal::Handle<v8::internal::HeapObject>, v8::internal::Handle<v8::internal::FunctionTemplateInfo>, v8::internal::Handle<v8::internal::Object>, v8::internal::BuiltinArguments) [/Users/rvaknin/.nvm/versions/node/v18.0.0/bin/node]
 6: 0x1027845e4 v8::internal::Builtin_HandleApiCall(int, unsigned long*, v8::internal::Isolate*) [/Users/rvaknin/.nvm/versions/node/v18.0.0/bin/node]
 7: 0x102f57a2c Builtins_CEntry_Return1_DontSaveFPRegs_ArgvOnStack_BuiltinExit [/Users/rvaknin/.nvm/versions/node/v18.0.0/bin/node]
 8: 0x102ee1b98 Builtins_InterpreterEntryTrampoline [/Users/rvaknin/.nvm/versions/node/v18.0.0/bin/node]
 ...
 ...
 ...
 ...

And this went away when I commented out process.exit(0), which was my lead suspect to begin with.
Did you try commenting that out?

Thanks,
Ran~

@RanVaknin RanVaknin added the response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. label Jan 12, 2023
@bes (Author) commented Jan 12, 2023

@RanVaknin There is only one process.exit(0), the one here:

main()
    .then(() => {
        console.log("Build done");
        // process.exit(0);
    })
    .catch((e) => {
        console.error(e);
        // process.exit(-1);
    });

I removed that one, and the PoC failed again with exit code 0.

I have a setup where I need to create an "empty file" on S3; it is used as a marker of sorts. It is surprising that this error should happen with empty files, and in any case it should exit with at least a stack trace. But it doesn't for me, unless I am misunderstanding something.

Feel free to create a branch on my PoC repo showing how you get an error / crash / stack trace.

@bes (Author) commented Jan 12, 2023

Example

~/web-aws-issue% git --no-pager diff                                                                                                                                                                                                                                                                                                                                                                         255 ↵
diff --git a/build.ts b/build.ts
index 16de02a..6c3b6e8 100644
--- a/build.ts
+++ b/build.ts
@@ -66,9 +66,9 @@ const pushToS3 = async (s3: S3, bucket: string, data: Buffer, toFile: string): P
 main()
     .then(() => {
         console.log("Build done");
-        process.exit(0);
+        // process.exit(0);
     })
     .catch((e) => {
         console.error(e);
-        process.exit(-1);
+        // process.exit(-1);
     });
09:23:27 bes ‹master* M›
~/web-aws-issue% yarn run poc
yarn run v1.22.19
$ TS_NODE_PROJECT="tsconfig.json" ts-node build.ts
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
DONE
START
Missing Content-Type for test/2, defaulting to application/octet-stream
Uploading s3://bucket-xyz-example/test/2 [Content-Type: application/octet-stream, Cache-Control: undefined]
✨  Done in 1.84s.

As you can see, it failed on the second upload. It shouldn't matter that I am pushing to "the same file" or that I am using "an empty file". It just shouldn't behave this way?

@mohghaderi

Could anyone please suggest a workaround?

It's been more than a year and I see no fix. It causes Electron to crash and we can't ship a product like that. aws-sdk v3 is useless for us due to this error, and v2 uses old versions of libraries that we don't want to risk for security reasons. It is nice that at least we have a workaround.

I commented about this on the Node repo and they suggest that it is the AWS SDK. I believe they are right, but I am not sure how the AWS SDK can cause such a crash in Node.js. It seems this happens more often with a choppy or slow internet connection.

@marco-ippolito commented Feb 16, 2023

I can confirm it happens more often on a slow internet connection; I've tested it with slow mobile data.
I think the problem is somewhere here:

    retryToken = await retryStrategy.refreshRetryTokenForRetry(retryToken, retryErrorInfo);
} catch (refreshError) {
    if (!lastError.$metadata) {
        lastError.$metadata = {};
    }
    lastError.$metadata.attempts = attempts + 1;
    lastError.$metadata.totalRetryDelay = totalRetryDelay;
    throw lastError;
}
attempts = retryToken.getRetryCount();
const delay = retryToken.getRetryDelay();
totalRetryDelay += delay;
await new Promise((resolve) => setTimeout(resolve, delay));

@mohghaderi

Based on marco-ippolito's answer, here are a few potential lines that may be problematic (a rough sketch of the kind of guard I mean follows this list):

Line 45: while (true) may cause an infinite loop if retryToken does not behave as expected. Maybe it can be changed to a maxAttempts check or some other guard that prevents looping forever.

The delay that comes back may be large enough that setTimeout effectively waits forever:

const delay = retryToken.getRetryDelay();

Maybe falling back on the V1 strategy causes infinite recursion at line 79:

return retryStrategy.retry(next, args);
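A rough sketch of the kind of guard I mean (illustrative only, not the SDK's actual retry middleware; retryWithCap is a made-up name):

// Illustrative only: bound the loop with maxAttempts and cap the backoff delay
// so neither the loop nor setTimeout can run unbounded.
const retryWithCap = async <T>(fn: () => Promise<T>, maxAttempts = 3): Promise<T> => {
    let lastError: unknown;
    for (let attempt = 1; attempt <= maxAttempts; attempt++) {
        try {
            return await fn();
        } catch (err) {
            lastError = err;
            const delay = Math.min(100 * 2 ** attempt, 2000); // capped backoff in ms
            await new Promise((resolve) => setTimeout(resolve, delay));
        }
    }
    throw lastError;
};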

@marco-ippolito commented Feb 17, 2023

The process is not crashing; it is exiting because the event loop has no active handles. My guess is that there is a race condition.
If you remove the httpRequest.on(...) wrapper and keep the writeBody call, it works fine:

httpRequest.on("continue", () => {
    writeBody(httpRequest, request.body);
});

@marco-ippolito

I opened a PR for this: #4492 @trivikr

@dozzes commented Mar 21, 2023

I have a similar issue with parallel downloading.

import { S3Client, ListObjectsV2Command, GetObjectCommand } from '@aws-sdk/client-s3'

(async () => {
    const client = new S3Client({
        credentials: {
            accessKeyId: 'XXXXXXX',
            secretAccessKey: 'XXXXX'
        },
        region: "us-east-1"
    })

    const bucket_name = 'Bucket'
    const input = {
        Bucket: bucket_name,
    }
    const list = await client.send(new ListObjectsV2Command(input)) 
    
    // Here I have 1000 files 38535294 bytes each

    await Promise.all(
        list.Contents.map(async file => {
            console.log(`Downloading file ${file.Key} [...]`);
            await client.send(
                new GetObjectCommand({
                    Bucket: bucket_name,
                    Key: file.Key
                }),
            );
            console.log(`File ${file.Key} downloaded`);
        }),
    );

    //
    // Doesn't reach this  console.log(). The process exits after 50 downloads with exit status 13:
    //  "Unfinished Top-Level Await: await was used outside of a function in the top-level code,
    //  but the passed Promise never resolved."
    //
    console.log('All files downloaded!');
})();

@empro-heikowissmann

We encountered this issue as well, not while processing a loop but by invoking a Lambda function multiple times in a short period of time:

const getImageLastModified = async (bucket: string, key: string): Promise<Date | undefined> => {
    try {
        const output = await s3Client.send(new GetObjectCommand({
            Bucket: bucket,
            Key: key,
        }))
        return output.LastModified
    } catch (e) {
        const error = e as AwsSdkS3Error
        if ('NoSuchKey' === error.name) {
            return undefined
        }
    }
}

We implemented the following workaround to fix the issue for now:

const keepAliveTimer = setTimeout(() => { /* do nothing but prevent node process from exiting */}, 5000)
const lastModified = await getImageLastModified('bucket','key')
clearTimeout(keepAliveTimer)
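The same idea can be wrapped around any SDK call; a sketch of the workaround generalized (withKeepAlive is our own helper name, not an SDK API):

// Keep a timer registered while the SDK call is in flight so the event loop
// cannot drain mid-request; the timer is cleared as soon as the call settles.
const withKeepAlive = async <T>(work: () => Promise<T>, ms = 5000): Promise<T> => {
    const keepAliveTimer = setTimeout(() => { /* no-op, only keeps the loop alive */ }, ms);
    try {
        return await work();
    } finally {
        clearTimeout(keepAliveTimer);
    }
};

// usage: const lastModified = await withKeepAlive(() => getImageLastModified('bucket', 'key'))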

Are other SDK clients' send functions also impacted by this issue, or is the S3 send function the only one?

@marco-ippolito

I've opened a PR to fix this: #4492

@kuhe (Contributor) commented May 23, 2023

Could someone who was previously able to reproduce this check if it's present in the latest version? https://github.com/aws/aws-sdk-js-v3/releases/tag/v3.337.0

An attempted fix was released based on the information in this issue, but since I wasn't able to reproduce it using the samples, I can't tell whether it's working.

@kuhe kuhe added the response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. label Jun 6, 2023
@github-actions bot commented Jun 7, 2023

This issue has not received a response in 1 week. If you still think there is a problem, please leave a comment to avoid the issue from automatically closing.

@github-actions github-actions bot added the closing-soon This issue will automatically close in 4 days unless further comments are made. label Jun 7, 2023
@RanVaknin RanVaknin added p2 This is a standard priority issue and removed response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. closing-soon This issue will automatically close in 4 days unless further comments are made. p1 This is a high priority issue labels Jun 7, 2023
@RanVaknin (Contributor)

Hi @bes @bracki @mohghaderi @heikowissmann @dozzes,

Can you please test with latest release to see if this is solved?

Thanks,
Ran~

@stefansundin commented Jun 7, 2023

I started having a similar error when I put Caddy in front of MinIO (an open source S3-compatible server). I used Caddy as a reverse proxy to terminate TLS.

Here's my error:

  err: {
   "type": "Error",
   "message": "write after end",
   "stack":
       Error [ERR_STREAM_WRITE_AFTER_END]: write after end
           at new NodeError (node:internal/errors:399:5)
           at ClientRequest.end (node:_http_outgoing:1010:15)
           at writeBody (/app/node_modules/@aws-sdk/node-http-handler/dist-cjs/write-request-body.js:22:21)
           at ClientRequest.<anonymous> (/app/node_modules/@aws-sdk/node-http-handler/dist-cjs/write-request-body.js:9:13)
           at ClientRequest.emit (node:events:513:28)
           at HTTPParser.parserOnIncomingClient [as onIncoming] (node:_http_client:643:11)
           at HTTPParser.parserOnHeadersComplete (node:_http_common:119:17)
           at TLSSocket.socketOnData (node:_http_client:542:22)
           at TLSSocket.emit (node:events:513:28)
           at addChunk (node:internal/streams/readable:324:12)
  }

That error led me to this AWS SDK v2 issue: aws/aws-sdk-js#3674

I realized that AWS SDK v3 was also using .on("continue", ...), so I changed it to .once, which fixed my problem. At this point I was going to open a pull request with that change, but I realized that I wasn't on the latest version of the SDK (I was using 3.332.0) and that there had been a bunch of changes in this file. So I updated my SDK and no longer need the once fix. I have a feeling that the SDK might still not be handling this properly and figured I'd chime in with what I found. Hope it is helpful.
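For reference, the change I tried looked roughly like this (a sketch based on the write-request-body.js snippet quoted earlier in this thread, not the SDK's current code):

// Register the "continue" listener with once() instead of on(), so the request body
// is only written a single time even if "continue" is emitted more than once.
httpRequest.once("continue", () => {
    writeBody(httpRequest, request.body);
});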

@RanVaknin RanVaknin removed the needs-review This issue/pr needs review from an internal developer. label Jun 13, 2023
@mohghaderi

@RanVaknin It passed our QA process, and we could upload several GB of test data to AWS S3 without any error (v3.348.0). Thanks.

@kuhe kuhe added the closing-soon This issue will automatically close in 4 days unless further comments are made. label Jun 21, 2023
@github-actions github-actions bot added closed-for-staleness and removed closing-soon This issue will automatically close in 4 days unless further comments are made. labels Jun 22, 2023
@ftr-lwilson

Hi there!

I have observed similar behaviour with the process exiting without explanation using the following code:

import { GetObjectCommand, S3Client } from "@aws-sdk/client-s3";

const BUCKET = "some bucket";
const KEY = "some key";
const REGION = "some region";

const log = (message: string) => {
  console.log(`${new Date().toISOString()}: ${message}`);
};

const loadTest = async () => {
  const s3Client = new S3Client({ region: REGION });
  let requestCount = 0;

  while (true) {
    requestCount += 1;
    const requestStartTime = performance.now();
    log(`Request ${requestCount} beginning`);
    try {
      await s3Client.send(new GetObjectCommand({ Bucket: BUCKET, Key: KEY }));
    } catch (e) {
      console.log(e);
    }
    const requestEndTime = performance.now();
    const requestDuration = requestEndTime - requestStartTime;
    log(
      `Request ${requestCount} completed in ${Math.round(requestDuration)}ms`
    );
  }
};

loadTest();

When running, it seems to make exactly 50 successful requests, and then exits with code 0 on the 51st one. There are no errors.

Log output:

2023-06-29T04:05:15.891Z: Request 1 beginning
2023-06-29T04:05:16.062Z: Request 1 completed in 171ms
2023-06-29T04:05:16.062Z: Request 2 beginning
2023-06-29T04:05:16.397Z: Request 2 completed in 336ms
2023-06-29T04:05:16.398Z: Request 3 beginning
2023-06-29T04:05:16.624Z: Request 3 completed in 226ms
2023-06-29T04:05:16.624Z: Request 4 beginning
2023-06-29T04:05:16.850Z: Request 4 completed in 226ms
2023-06-29T04:05:16.850Z: Request 5 beginning
2023-06-29T04:05:17.184Z: Request 5 completed in 333ms
2023-06-29T04:05:17.184Z: Request 6 beginning
2023-06-29T04:05:17.492Z: Request 6 completed in 308ms
2023-06-29T04:05:17.492Z: Request 7 beginning
2023-06-29T04:05:17.778Z: Request 7 completed in 286ms
2023-06-29T04:05:17.778Z: Request 8 beginning
2023-06-29T04:05:17.990Z: Request 8 completed in 212ms
2023-06-29T04:05:17.990Z: Request 9 beginning
2023-06-29T04:05:18.316Z: Request 9 completed in 325ms
2023-06-29T04:05:18.316Z: Request 10 beginning
2023-06-29T04:05:18.662Z: Request 10 completed in 346ms
2023-06-29T04:05:18.662Z: Request 11 beginning
2023-06-29T04:05:19.062Z: Request 11 completed in 400ms
2023-06-29T04:05:19.062Z: Request 12 beginning
2023-06-29T04:05:19.305Z: Request 12 completed in 243ms
2023-06-29T04:05:19.306Z: Request 13 beginning
2023-06-29T04:05:19.755Z: Request 13 completed in 449ms
2023-06-29T04:05:19.755Z: Request 14 beginning
2023-06-29T04:05:20.035Z: Request 14 completed in 280ms
2023-06-29T04:05:20.035Z: Request 15 beginning
2023-06-29T04:05:20.409Z: Request 15 completed in 374ms
2023-06-29T04:05:20.409Z: Request 16 beginning
2023-06-29T04:05:20.691Z: Request 16 completed in 282ms
2023-06-29T04:05:20.691Z: Request 17 beginning
2023-06-29T04:05:21.041Z: Request 17 completed in 349ms
2023-06-29T04:05:21.041Z: Request 18 beginning
2023-06-29T04:05:21.385Z: Request 18 completed in 344ms
2023-06-29T04:05:21.385Z: Request 19 beginning
2023-06-29T04:05:21.611Z: Request 19 completed in 226ms
2023-06-29T04:05:21.611Z: Request 20 beginning
2023-06-29T04:05:22.015Z: Request 20 completed in 403ms
2023-06-29T04:05:22.015Z: Request 21 beginning
2023-06-29T04:05:22.330Z: Request 21 completed in 316ms
2023-06-29T04:05:22.330Z: Request 22 beginning
2023-06-29T04:05:22.696Z: Request 22 completed in 365ms
2023-06-29T04:05:22.696Z: Request 23 beginning
2023-06-29T04:05:22.988Z: Request 23 completed in 292ms
2023-06-29T04:05:22.988Z: Request 24 beginning
2023-06-29T04:05:23.210Z: Request 24 completed in 223ms
2023-06-29T04:05:23.211Z: Request 25 beginning
2023-06-29T04:05:23.630Z: Request 25 completed in 419ms
2023-06-29T04:05:23.630Z: Request 26 beginning
2023-06-29T04:05:24.043Z: Request 26 completed in 413ms
2023-06-29T04:05:24.043Z: Request 27 beginning
2023-06-29T04:05:24.429Z: Request 27 completed in 386ms
2023-06-29T04:05:24.429Z: Request 28 beginning
2023-06-29T04:05:24.727Z: Request 28 completed in 298ms
2023-06-29T04:05:24.727Z: Request 29 beginning
2023-06-29T04:05:24.955Z: Request 29 completed in 228ms
2023-06-29T04:05:24.955Z: Request 30 beginning
2023-06-29T04:05:25.323Z: Request 30 completed in 368ms
2023-06-29T04:05:25.323Z: Request 31 beginning
2023-06-29T04:05:25.529Z: Request 31 completed in 205ms
2023-06-29T04:05:25.529Z: Request 32 beginning
2023-06-29T04:05:25.861Z: Request 32 completed in 332ms
2023-06-29T04:05:25.861Z: Request 33 beginning
2023-06-29T04:05:25.991Z: Request 33 completed in 130ms
2023-06-29T04:05:25.991Z: Request 34 beginning
2023-06-29T04:05:26.278Z: Request 34 completed in 287ms
2023-06-29T04:05:26.278Z: Request 35 beginning
2023-06-29T04:05:26.543Z: Request 35 completed in 265ms
2023-06-29T04:05:26.543Z: Request 36 beginning
2023-06-29T04:05:26.853Z: Request 36 completed in 310ms
2023-06-29T04:05:26.854Z: Request 37 beginning
2023-06-29T04:05:27.276Z: Request 37 completed in 423ms
2023-06-29T04:05:27.276Z: Request 38 beginning
2023-06-29T04:05:27.651Z: Request 38 completed in 375ms
2023-06-29T04:05:27.651Z: Request 39 beginning
2023-06-29T04:05:28.029Z: Request 39 completed in 378ms
2023-06-29T04:05:28.030Z: Request 40 beginning
2023-06-29T04:05:28.171Z: Request 40 completed in 142ms
2023-06-29T04:05:28.171Z: Request 41 beginning
2023-06-29T04:05:28.353Z: Request 41 completed in 182ms
2023-06-29T04:05:28.353Z: Request 42 beginning
2023-06-29T04:05:28.705Z: Request 42 completed in 352ms
2023-06-29T04:05:28.705Z: Request 43 beginning
2023-06-29T04:05:28.993Z: Request 43 completed in 288ms
2023-06-29T04:05:28.993Z: Request 44 beginning
2023-06-29T04:05:29.319Z: Request 44 completed in 326ms
2023-06-29T04:05:29.319Z: Request 45 beginning
2023-06-29T04:05:29.581Z: Request 45 completed in 263ms
2023-06-29T04:05:29.582Z: Request 46 beginning
2023-06-29T04:05:29.953Z: Request 46 completed in 371ms
2023-06-29T04:05:29.953Z: Request 47 beginning
2023-06-29T04:05:30.321Z: Request 47 completed in 368ms
2023-06-29T04:05:30.321Z: Request 48 beginning
2023-06-29T04:05:30.620Z: Request 48 completed in 299ms
2023-06-29T04:05:30.621Z: Request 49 beginning
2023-06-29T04:05:31.046Z: Request 49 completed in 425ms
2023-06-29T04:05:31.046Z: Request 50 beginning
2023-06-29T04:05:31.299Z: Request 50 completed in 253ms
2023-06-29T04:05:31.299Z: Request 51 beginning
  • @aws-sdk/client-s3: "^3.360.0"
  • node v18.16.0
  • macbook m1 pro

This seems very unusual, and it concerns me that this could happen on our production servers with absolutely no trace of why the process exited.
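One way to at least get a trace when this happens (a diagnostic sketch, not a fix): Node emits "beforeExit" when the event loop drains, which is exactly the silent exit-code-0 case, so you can log it.

// Diagnostic sketch: "beforeExit" fires when the event loop has run out of work
// (it does not fire on an explicit process.exit()), so it marks the silent exit described above.
process.on("beforeExit", (code) => {
  console.log(`${new Date().toISOString()}: event loop drained, about to exit with code ${code}`);
});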

Hopefully this information helps!
Cheers, Luke.

@github-actions bot

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs and link to relevant comments in this thread.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Jul 14, 2023