
Increase max_semi_space #67

Open

ronag opened this issue Mar 22, 2023 · 10 comments
Labels
help wanted (Extra attention is needed)

Comments

@ronag
Member

ronag commented Mar 22, 2023

See nodejs/node#42511.

@ronag
Member Author

ronag commented Mar 22, 2023

This was discussed at a TSC meeting and we would love to have a PR to discuss.

@ronag added the help wanted label Mar 22, 2023
@mcollina
Member

In my experience, this can lead to reduced memory consumption for most server-side workloads because fewer objects will be moved to old space.
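
For anyone who wants to see this locally, here is a minimal sketch (the file name and workload are made up for illustration); running it once with plain node and once with node --max-semi-space-size=64 may show less old-space growth with the larger semi-space, though results will vary:

// alloc-test.js — hypothetical workload producing many short-lived objects,
// roughly what a request handler does
const v8 = require('node:v8');

function oldSpaceUsed() {
  const space = v8.getHeapSpaceStatistics().find((s) => s.space_name === 'old_space');
  return space.space_used_size;
}

const before = oldSpaceUsed();
for (let i = 0; i < 1e6; i++) {
  const tmp = { id: i, payload: 'x'.repeat(32) };
  if (tmp.id < 0) console.log(tmp); // keep the allocation from being optimized away
}
const after = oldSpaceUsed();
console.log(`old_space growth: ${((after - before) / 1024 / 1024).toFixed(2)} MiB`);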

@anonrig
Member

anonrig commented Mar 22, 2023

I'd like to take a look at this. Does anybody have any extra knowledge on this, in case some help is needed?

@debadree25
Member

Would setting the option value in node.cc like this be the way to go, or is there a better place to put this?

diff --git a/src/node.cc b/src/node.cc
index 2aba7333d9..e1e0a6b924 100644
--- a/src/node.cc
+++ b/src/node.cc
@@ -779,6 +779,8 @@ static ExitCode InitializeNodeWithArgsInternal(
   // is security relevant, for Node it's less important.
   V8::SetFlagsFromString("--no-freeze-flags-after-init");
 
+  V8::SetFlagsFromString("--max_semi_space_size=128");
+
 #if defined(NODE_V8_OPTIONS)
   // Should come before the call to V8::SetFlagsFromCommandLine()
   // so the user can disable a flag --foo at run-time by passing
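
For what it's worth, a candidate value can also be tried at runtime without patching node.cc, which makes comparing settings easier; a rough sketch (the file name is hypothetical; heap_size_limit includes the young-generation reservation of roughly three times the semi-space size):

// check-heap.js — hypothetical; run e.g.
//   node --max-semi-space-size=128 check-heap.js
// and compare against a run without the flag
const v8 = require('node:v8');
console.log('heap_size_limit (MiB):',
  v8.getHeapStatistics().heap_size_limit / (1024 * 1024));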

@BridgeAR
Member

Let's maybe start with 64 MB, as it shows a smaller peak memory increase while still providing a significant performance improvement overall?

@debadree25
Member

debadree25 commented Mar 27, 2023

Sure, I ran the tests again; it seems like 64 MB is actually the sweet spot.

For 128 MB:

debadreechatterjee@Debadree-TagMango-MacBook-Pro web-tooling-benchmark % /Users/debadreechatterjee/Documents/Personal/node/node-perf dist/cli.js
Running Web Tooling Benchmark v0.5.3…
-------------------------------------
         acorn: 14.11 runs/s
         babel: 10.97 runs/s
  babel-minify: 16.76 runs/s
       babylon: 20.19 runs/s
         buble:  6.60 runs/s
          chai: 18.82 runs/s
  coffeescript: 10.61 runs/s
        espree:  5.38 runs/s
       esprima: 16.88 runs/s
        jshint: 15.18 runs/s
         lebab: 15.31 runs/s
       postcss:  9.26 runs/s
       prepack:  9.91 runs/s
      prettier: 10.38 runs/s
    source-map: 10.87 runs/s
        terser: 29.18 runs/s
    typescript: 15.18 runs/s
     uglify-js:  7.73 runs/s
-------------------------------------
Geometric mean: 12.45 runs/s

And for 64 MB:

debadreechatterjee@Debadree-TagMango-MacBook-Pro web-tooling-benchmark % /Users/debadreechatterjee/Documents/Personal/node/node-perf --max-semi-space-size=64 dist/cli.js 
Running Web Tooling Benchmark v0.5.3…
-------------------------------------
         acorn: 16.04 runs/s
         babel: 11.32 runs/s
  babel-minify: 16.83 runs/s
       babylon: 18.43 runs/s
         buble:  7.23 runs/s
          chai: 19.44 runs/s
  coffeescript: 11.17 runs/s
        espree:  5.46 runs/s
       esprima: 19.62 runs/s
        jshint: 17.77 runs/s
         lebab: 16.95 runs/s
       postcss:  8.80 runs/s
       prepack: 11.63 runs/s
      prettier: 10.41 runs/s
    source-map: 11.36 runs/s
        terser: 29.36 runs/s
    typescript: 14.97 runs/s
     uglify-js:  8.74 runs/s
-------------------------------------
Geometric mean: 13.12 runs/s

I just needed to know if that was the right place to put this change since I'm new to the C++ part of the codebase 😅

@webcarrot

webcarrot commented May 3, 2023

I'm not sure that this is safe to do:

I was trying to use --max-semi-space-size=64 for one of the services. After about a week, it (the GC?) just blew up during high traffic:

Memory:

P1:

[Screenshot 2023-05-03 13:31:06]

  • yellow requested "new memory"
  • green used "new memory"

P2:

[Screenshot 2023-05-03 13:30:49]

  • orange requested "new memory"
  • blue used "new memory"

As we can see, at some point V8 started requesting 128 MiB (rather than 64 MiB) of memory for the "new" space, while not being able to actually use it.

The processes started to use multiple CPU cores (3 cores at 100% each):

[Screenshot 2023-05-03 13:41:10]

  • yellow P1
  • green P1

The funny thing is that these services were still able to handle traffic without any issues.
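
For anyone trying to correlate this, a rough sketch of how such "requested" vs "used" new-space graphs can be collected from inside the process (interpreting "requested" as the committed new-space size is my assumption; the interval and field names are illustrative):

// log new-space committed vs used sizes periodically
const v8 = require('node:v8');

setInterval(() => {
  const newSpace = v8.getHeapSpaceStatistics()
    .find((space) => space.space_name === 'new_space');
  console.log(JSON.stringify({
    newSpaceCommittedMiB: newSpace.space_size / (1024 * 1024),
    newSpaceUsedMiB: newSpace.space_used_size / (1024 * 1024),
  }));
}, 10_000).unref();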

@debadree25
Member

debadree25 commented May 3, 2023

I think presently the flag --max-semi-space-size=64 just directly sets the semi-space to 64 MB. From the basic understanding I got from reading that part of the V8 code, we need to change the partition between the young-gen and old-gen space (with semi-space as a proportion of young space); directly setting it causes OOM errors. In the PR nodejs/node#47277 the CI failures are also all OOM-related tests (although I haven't yet fully investigated all the nitty-gritties here and the PR is still pending 😅, feel free to correct me if I said something wrong).

@joebowbeer

The node docs already explicitly mention setting max-old-space-size, and this is essential for efficient use of memory in single-process server applications. The reason this setting isn't baked in is because of the qualifications, such as single-process, and because YMMV.

IIRC, max-semi-space-size will be derived from max-old-space-size.

If anything is done wrt max-semi-space-size, I suggest first adding a recommendation to the docs to accompany (and complement) the max-old-space-size recommendation.

@joebowbeer

joebowbeer commented Oct 22, 2024

In the version of v8 (11.3) used in Node 20, the default max-semi-space-size is no longer a constant (16MB),
and instead scales with the memory limit.

The documentation has been updated:

nodejs/node#55812

This can have an undesirable performance impact when upgrading from Node 18 to Node 20:

https://blog.ztec.fr/en/2024/post/node.js-20-upgrade-journey-though-unexpected-heap-issues-with-kubernetes/

In Node 20, max-semi-space-size depends on the memory limit and its default size will be less than 16MB for memory limits up to 2GB, inclusive.

For example, a memory limit of 512 MB will default to 1 MB of semi space in v20, whereas in v18 the default for 64-bit systems is 16 MB.

Without an explicit semi-space-size in node:20, the young generation in this case is only 3 MB:

docker run -it -m 512m \
  -e NODE_OPTIONS="--max-old-space-size=384" \
  node:20 node -e 'console.log(v8.getHeapStatistics().heap_size_limit/(1024*1024))'
387

387-384 = 3MB = 3 x max-semi-space-size (1MB)

Configuring semi-space-size, below, for the same young generation size (48 MB) as node:18:

docker run -it -m 512m \
  -e NODE_OPTIONS="--max-old-space-size=384 --max-semi-space-size=16" \
  node:20 node -e 'console.log(v8.getHeapStatistics().heap_size_limit/(1024*1024))'
432

432-384 = 48MB = 3 x max-semi-space-size (16MB)
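
The same arithmetic can be done from inside the process; a small sketch (the 384 must match whatever --max-old-space-size was passed, and is just the value from the example above):

// derive the effective max semi-space size from heap_size_limit,
// following the arithmetic above (young generation ≈ 3 x max-semi-space-size)
const v8 = require('node:v8');

const limitMiB = v8.getHeapStatistics().heap_size_limit / (1024 * 1024);
const oldSpaceMiB = 384; // must match --max-old-space-size
const semiSpaceMiB = (limitMiB - oldSpaceMiB) / 3;
console.log({ limitMiB, semiSpaceMiB });
// with --max-old-space-size=384 --max-semi-space-size=16 this prints roughly
// { limitMiB: 432, semiSpaceMiB: 16 }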
