There are some common "mistakes" (or missed opportunities) that lead to increased heap size:

- reading in a large chunk of memory and then referencing just some bits of it via []byte/string (subject to detaching)
- producing lots of duplicate strings after reading in some data (subject to interning)
- alignment padding in structs (subject to field reordering)
- maybe we can find more opportunities
Potentially the runtime could do an expensive analysis (for simplicity it can start with a STW) and dump a profile with these missed opportunities. Namely: run heap marking at the intra-object level (mark/scan only the reachable parts of objects); run a full heap duplication analysis (hash objects by size/type/contents); then cross this with heap profile data and dump the intersection.
This may provide a very cost-efficient way to optimize heap size. E.g. this allocation stack allocated N GB that are 99% unreachable, or that stack allocated M GB of duplicate strings, etc.
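To make the duplication pass concrete, here is a hypothetical userland sketch (not the proposed runtime mechanism, which would walk the real heap): bucket objects by size plus a content hash, and count every copy beyond the first in a bucket as waste that interning or deduplication would reclaim. The function name `duplicateWaste` is invented for this example:

```go
package main

import (
	"crypto/sha256"
	"fmt"
)

// duplicateWaste buckets byte objects by (size, content hash) and sums
// the bytes of every duplicate copy — an estimate of how much heap an
// interning/deduplication pass could reclaim.
func duplicateWaste(objects [][]byte) int {
	type key struct {
		size int
		sum  [32]byte
	}
	seen := map[key]int{}
	waste := 0
	for _, o := range objects {
		k := key{len(o), sha256.Sum256(o)}
		if seen[k] > 0 {
			waste += len(o) // a redundant copy: all of its bytes are waste
		}
		seen[k]++
	}
	return waste
}

func main() {
	objs := [][]byte{
		[]byte("hello"), []byte("hello"), []byte("world"), []byte("hello"),
	}
	fmt.Println(duplicateWaste(objs)) // 10: two redundant copies of "hello"
}
```

The proposal's version would additionally record the allocation stack for each bucket, so the report can say which call site produced the duplicates.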
I'd be uneasy about exposing something so expensive to collect (including a STW that's proportional to something other than the number of Ps) as a runtime/pprof.Profile.