contributing: buildup failing on Gitpod #24932

Closed
lassiraa opened this issue Apr 4, 2023 · 1 comment · Fixed by #24905
Labels
aws-cdk-lib Related to the aws-cdk-lib package bug This issue is a bug.

Comments


lassiraa commented Apr 4, 2023

Describe the bug

When launching a gitpod instance with aws-cdk, the buildup fails at the aws-cdk-lib package.

Expected Behavior

Successful buildup when starting up Gitpod.

Current Behavior

Error with the following JS stack trace:

<--- Last few GCs --->

[1947:0x51be1d0] 219838 ms: Mark-sweep (reduce) 4035.7 (4105.8) -> 4034.8 (4106.1) MB, 2600.4 / 0.0 ms (average mu = 0.081, current mu = 0.008) allocation failure scavenge might not succeed
[1947:0x51be1d0] 222069 ms: Mark-sweep (reduce) 4035.9 (4103.1) -> 4035.2 (4104.3) MB, 2024.8 / 0.0 ms (average mu = 0.086, current mu = 0.092) allocation failure scavenge might not succeed

<--- JS stacktrace --->

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
1: 0xa3ad50 node::Abort() [/usr/bin/node]
2: 0x970199 node::FatalError(char const*, char const*) [/usr/bin/node]
3: 0xbba90e v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/usr/bin/node]
4: 0xbbac87 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/usr/bin/node]
5: 0xd76ea5 [/usr/bin/node]
6: 0xd77a2f [/usr/bin/node]
7: 0xd8586b v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/usr/bin/node]
8: 0xd8942c v8::internal::Heap::AllocateRawWithRetryOrFailSlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/usr/bin/node]
9: 0xd57b0b v8::internal::Factory::NewFillerObject(int, bool, v8::internal::AllocationType, v8::internal::AllocationOrigin) [/usr/bin/node]
10: 0x10a015f v8::internal::Runtime_AllocateInYoungGeneration(int, unsigned long*, v8::internal::Isolate*) [/usr/bin/node]
11: 0x1449379 [/usr/bin/node]
Aborted
Error: /workspace/aws-cdk/tools/@aws-cdk/cdk-build-tools/node_modules/jsii/bin/jsii --silence-warnings=reserved-word --add-deprecation-warnings --compress-assembly '--strip-deprecated /workspace/aws-cdk/deprecated_apis.txt' exited with error code 134
Build failed.!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
error Command failed with exit code 1.
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
error: last command failed. fix problem and resume by executing: /workspace/aws-cdk/scripts/foreach.sh
directory: /workspace/aws-cdk/packages/aws-cdk-lib

Reproduction Steps

Opening the Contributing guide and starting a Gitpod instance reproduces the issue.

Possible Solution

The error is an out-of-memory error, so the issue could be due to the limited amount of memory on the Gitpod instances, or perhaps something worse. I've tried the 8 GB and 16 GB RAM instances with the same end result. The current builds are obviously succeeding, so I am guessing it's something Gitpod-specific in this case.
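As a possible workaround (just a sketch based on the error output above, not necessarily what the eventual fix in #24905 does), one could raise Node's heap limit before resuming the build, since the trace shows V8 aborting near a ~4 GB old-space ceiling:

```sh
# Sketch only: give Node a larger old-space heap than the ~4 GB it crashed at,
# then resume the failed build with the command printed in the error output.
export NODE_OPTIONS="--max-old-space-size=8192"  # 8 GiB; pick a value below the Gitpod instance's RAM
/workspace/aws-cdk/scripts/foreach.sh            # resume command suggested by the error above
```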

Additional Information/Context

No response

CDK CLI Version

Framework Version

No response

Node.js Version

14.21.3

OS

Ubuntu 22.04.2 LTS

Language

Typescript

Language Version

No response

Other information

No response

@lassiraa lassiraa added bug This issue is a bug. needs-triage This issue or PR still needs to be triaged. labels Apr 4, 2023
@github-actions github-actions bot added the aws-cdk-lib Related to the aws-cdk-lib package label Apr 4, 2023
@khushail khushail added the needs-reproduction This issue needs reproduction. label Apr 4, 2023
@khushail khushail self-assigned this Apr 4, 2023
@khushail khushail added the investigating This issue is being investigated and/or work is in progress to resolve the issue. label Apr 4, 2023
@mergify mergify bot closed this as completed in #24905 Apr 5, 2023
mergify bot pushed a commit that referenced this issue Apr 5, 2023
…#24905)

This follows the pattern in #24425 which seems to especially happen
after #24376 when running these scripts.

Closes #24932

----

*By submitting this pull request, I confirm that my contribution is made under the terms of the Apache-2.0 license*

github-actions bot commented Apr 5, 2023

⚠️COMMENT VISIBILITY WARNING⚠️

Comments on closed issues are hard for our team to see.
If you need more assistance, please either tag a team member or open a new issue that references this one.
If you wish to keep having a conversation with other community members under this issue feel free to do so.

@pahud pahud removed needs-triage This issue or PR still needs to be triaged. investigating This issue is being investigated and/or work is in progress to resolve the issue. needs-reproduction This issue needs reproduction. labels Apr 5, 2023