callsite inlining interacts very badly with source deletion #42078
Labels: compiler:optimizer (Optimization passes, mostly in base/compiler/ssair/), added by aviatesk on Sep 1, 2021
aviatesk added a commit that referenced this issue on Sep 1, 2021: "After #41328, inference can observe statement flags and try to re-infer a discarded source if it's going to be inlined. The re-inferred source will only be cached into the inference-local cache, and won't be cached globally."
LilithHafner pushed a commit to LilithHafner/julia that referenced this issue on Feb 22, 2022: "…liaLang#42082) After JuliaLang#41328, inference can observe statement flags and try to re-infer a discarded source if it's going to be inlined. The re-inferred source will only be cached into the inference-local cache, and won't be cached globally."
There are several points in our codegen pipeline that discard inferred source code that is not "inlineable", and this easily breaks the idempotency of callsite inlining, e.g.:
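The concrete reproducer that presumably followed this "e.g." did not survive the page scrape. The sketch below is only a hypothetical illustration of the pattern under discussion, assuming the Julia 1.8+ callsite form of `@inline`; the names `big_computation` and `f` are illustrative, not from the issue:

```julia
# Hypothetical sketch of the problematic pattern.
# A method judged "not inlineable", whose inferred source codegen may later discard:
@noinline function big_computation(x)
    s = 0.0
    for i in 1:100
        s += sin(x + i)
    end
    return s
end

# A callsite @inline annotation (Julia 1.8+) asks the optimizer to inline
# this particular call anyway, which requires that source to still exist:
f(x) = @inline big_computation(x)
```

Whether the callsite `@inline` can be honored then depends on whether the discarded source can be recovered at that point, which is the idempotency problem described here.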
We can keep the idempotency if we turn off the following lines:
julia/src/codegen.cpp
Line 7743 in 2a0ab37
julia/base/compiler/types.jl
Line 216 in 2a0ab37
but of course this leads to fat sysimages...
So that actually means we really want the "re-inference" stuff here after all?
julia/base/compiler/optimize.jl
Lines 38 to 44 in 2a0ab37