precompile+ #15934
Conversation
@@ -93,26 +103,24 @@ function find_in_path(name::AbstractString, wd = pwd())
    if wd !== nothing
        isfile_casesensitive(joinpath(wd,name)) && return joinpath(wd,name)
    end
    for prefix in [Pkg.dir(); LOAD_PATH]
@JeffBezanson compiling this vcat seems to be a difficult problem for inference to pre-generate, so I've rewritten it in a way that generates less code (and thus saves a bit of time, multiplied by every precompiled module startup).
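For context, a minimal sketch of the kind of rewrite described, assuming the point is simply to avoid splicing `Pkg.dir()` and `LOAD_PATH` together with vcat; the function name is hypothetical, it targets the 0.4-era Base API shown in the diff, and it is not the actual patch:

```julia
# Illustrative only: search Pkg.dir() first, then each LOAD_PATH entry,
# without allocating the temporary `[Pkg.dir(); LOAD_PATH]` vector, so
# inference has no vcat specialization to pre-generate for this call site.
# (The real loading.jl uses isfile_casesensitive rather than isfile.)
function find_in_path_sketch(name::AbstractString)
    path = joinpath(Pkg.dir(), name)
    isfile(path) && return path
    for prefix in LOAD_PATH
        path = joinpath(prefix, name)
        isfile(path) && return path
    end
    return nothing
end
```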
Force-pushed from ff61785 to ad37021.
Gulp. Increases sys.so by 20%.
yes. but, just think of the savings of not creating that same
With this change, test-numbers takes ~130 seconds. Disabling the extra precompile brings it back to ~90 seconds. I tried filtering out types containing `DataType` in `jl_get_specialization1`, but no luck so far.
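A hedged sketch, in Julia rather than the C of `jl_get_specialization1`, of the kind of filter being tried; the helper name is made up and the real experiment was a C-level check:

```julia
# Illustrative only: walk a signature type and report whether DataType
# appears anywhere in its parameters.
function mentions_datatype(t)
    t === DataType && return true
    isa(t, DataType) || return false
    for p in t.parameters
        isa(p, Type) && mentions_datatype(p) && return true
    end
    return false
end

mentions_datatype(Tuple{Int,Vector{DataType}})  # true  -> skip precompiling
mentions_datatype(Tuple{Int,Float64})           # false -> precompile as usual
```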
# `wd` is a working directory to search. defaults to current working directory.
# if `wd === nothing`, no extra path is searched.
function find_in_path(name::AbstractString, wd = pwd())
function find_in_path(name::ByteString, wd = pwd())
This caused JuliaLang/PkgDev.jl#38 (comment) and needs a converting fallback.
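A minimal sketch of such a converting fallback, assuming the simplest possible shape (not necessarily the method that was actually added; `bytestring` is the 0.4-era conversion):

```julia
# Generic fallback: convert any other AbstractString (e.g. a SubString)
# and dispatch to the fast ByteString method.
find_in_path(name::AbstractString, wd = pwd()) = find_in_path(bytestring(name), wd)
```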
Previously, a cache entry would be widened beyond the despecialization heuristic, creating an ordering dependence such that after inserting a Function into the cache (as Any) or a Tuple (as DataType), no more methods would attempt to specialize that slot. This should fix the issue noted in #15934 (comment).
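To make the ordering dependence concrete, a hedged illustration of the scenario described in the commit message (the comments restate the old behavior; none of this is directly observable from user code):

```julia
g(x) = x  # one method with a single, unconstrained argument slot

g(sin)    # inserting a Function first caches the slot widened to Any...
g(1)      # ...so under the old heuristic later calls like this one would
          # no longer get their own Int specialization; the fix removes
          # that dependence on call order
```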
This makes the precompile.jl file more effective. Previously, `jl_get_specialization1` would get called on the toplevel function and any direct leaf call, recursively. This extends that to make sure it gets called on anything inference thought might get called. The primary relevant improvement for `loading.jl` was that a `ByteString` return type didn't transfer a high compile-time penalty into runtime.
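As a rough illustration of the difference, with hypothetical functions and only an approximation of the internal behavior described above:

```julia
entry() = helper(rand(Bool) ? 1 : 1.0)
helper(x) = x + 1

# Before: roughly the toplevel function and its direct leaf calls were
# handed to jl_get_specialization1:
precompile(entry, ())

# After: any signature inference thought might be reached is also
# pre-specialized, e.g. both element types flowing into helper:
precompile(helper, (Int,))
precompile(helper, (Float64,))
```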