add a cache to reuse identical anonymous functions in the REPL #21113
Comments
CC @dlfivefifty.
Can you compare the timings with Julia 0.5.1? There might be a regression in compilation time too (meaning that it could be faster).
I know it's only an example, but it will help to use a wrapper type along the lines of the sketch below. Wrapping functions in that will avoid specializing code for each function.
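(A minimal sketch of that kind of wrapper, with illustrative names rather than the original code from the comment: a callable struct whose field is deliberately left untyped, so every wrapped function shares one concrete type.)

struct DynamicFunction <: Function
    f                                      # deliberately untyped: all wrapped callables share this one type
end
(d::DynamicFunction)(x...) = d.f(x...)     # forward calls, paying one dynamic dispatch per call

A constructor such as Fun can then be compiled once for DynamicFunction and reused for every wrapped anonymous function, trading a dynamic dispatch at call time for the per-function compilation.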
I just tried the struct trick, and it works very well. Thanks @JeffBezanson.
…n the REPL) This is an implementation of @JeffBezanson's suggestion in JuliaLang/julia#21113. Constructors are fast again.
Thank you very much @JeffBezanson. Of course future improvements will always be welcome, but I've implemented the suggestion:
julia> versioninfo()
Julia Version 0.6.0-pre.alpha.173
Commit 93cddfc* (2017-03-18 17:19 UTC)
Platform Info:
OS: macOS (x86_64-apple-darwin16.4.0)
CPU: Intel(R) Core(TM) i7-4980HQ CPU @ 2.80GHz
WORD_SIZE: 64
BLAS: libopenblas (USE64BITINT DYNAMIC_ARCH NO_AFFINITY Haswell)
LAPACK: libopenblas64_
LIBM: libopenlibm
LLVM: libLLVM-3.9.1 (ORCJIT, haswell)
julia> using ApproxFun # branch: infinity-is-a-free-man
julia> @time Fun(x->exp(x))
4.434731 seconds (5.99 M allocations: 262.131 MiB, 2.76% gc time)
Fun(Chebyshev(【-1.0,1.0】),[1.26607, 1.13032, 0.271495, 0.0443368, 0.00547424, 0.000542926, 4.49773e-5, 3.19844e-6, 1.99212e-7, 1.10368e-8, 5.5059e-10, 2.49797e-11, 1.03911e-12, 3.99195e-14])
julia> @time Fun(x->exp(x))
0.004825 seconds (816 allocations: 38.563 KiB)
Fun(Chebyshev(【-1.0,1.0】),[1.26607, 1.13032, 0.271495, 0.0443368, 0.00547424, 0.000542926, 4.49773e-5, 3.19844e-6, 1.99212e-7, 1.10368e-8, 5.5059e-10, 2.49797e-11, 1.03911e-12, 3.99195e-14])
With a generic function:
julia> g = x->exp(x)
(::#5) (generic function with 1 method)
julia> @time Fun(g)
0.005058 seconds (799 allocations: 37.367 KiB)
Fun(Chebyshev(【-1.0,1.0】),[1.26607, 1.13032, 0.271495, 0.0443368, 0.00547424, 0.000542926, 4.49773e-5, 3.19844e-6, 1.99212e-7, 1.10368e-8, 5.5059e-10, 2.49797e-11, 1.03911e-12, 3.99195e-14])
julia> @time Fun(g)
0.000681 seconds (485 allocations: 18.094 KiB)
Fun(Chebyshev(【-1.0,1.0】),[1.26607, 1.13032, 0.271495, 0.0443368, 0.00547424, 0.000542926, 4.49773e-5, 3.19844e-6, 1.99212e-7, 1.10368e-8, 5.5059e-10, 2.49797e-11, 1.03911e-12, 3.99195e-14])
The change in run-time is negligible (in the perceived experience). The more complicated bivariate constructors are substantially faster again and can work with generic functions. (I forgot to show these, so here they are:)
julia> using ApproxFun # branch: infinity-is-a-free-man
julia> using SingularIntegralEquations # branch: development
julia> @time G = GreensFun((x,y)->1/2,CauchyWeight(Chebyshev()⊗Chebyshev(),0))
12.237095 seconds (16.23 M allocations: 722.966 MiB, 2.69% gc time)
GreensFun with kernels:
{
LowRankFun on π⁻¹log|y-x|[Chebyshev(【-1.0,1.0】)⊗Chebyshev(【-1.0,1.0】)] of rank 1.
}
julia> @time G = GreensFun((x,y)->1/2,CauchyWeight(Chebyshev()⊗Chebyshev(),0))
0.005827 seconds (31.43 k allocations: 615.939 KiB)
GreensFun with kernels:
{
LowRankFun on π⁻¹log|y-x|[Chebyshev(【-1.0,1.0】)⊗Chebyshev(【-1.0,1.0】)] of rank 1.
}
Caching identical anonymous functions would be nice to have in many contexts; e.g. dot-call broadcasting generates lots of anonymous functions.
Although it's undecidable whether two functions compute the same thing, deciding that two functions are syntactically equivalent and refer to the same global objects should be possible.
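For illustration (these snippets are not from the thread): syntactically identical literals already get distinct types, but their surface syntax compares equal once line-number nodes are stripped.

julia> typeof(x -> exp(x)) == typeof(x -> exp(x))
false

julia> Base.remove_linenums!(:(x -> exp(x))) == Base.remove_linenums!(:(x -> exp(x)))
true

Checking that the referenced globals match would be the extra step beyond this purely syntactic comparison.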
Renaming this to reflect that caching functions is the main to-do here.
I think the issue is slightly broader than caching, since in interactive use one uses a lot of different anonymous functions. While caching helps with repeated identical functions, it would be nice if you could flip a switch to avoid specializing on function arguments altogether.
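One existing knob in this direction, mentioned here only for comparison (it arrived in later Julia versions and is not the snippet the comment referred to): @nospecialize asks the compiler not to specialize a method body on a particular argument, so one compiled body serves every function passed in.

# Hypothetical constructor-like method; the name and body are only for illustration.
coeffs(@nospecialize(f)) = [f(0.0), f(0.5), f(1.0)]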
How about this @cached macro (based on MikeInnes's fn macro)?
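The macro itself was not preserved above; what follows is a minimal sketch of the idea under one assumption: the cache is keyed on the anonymous function's expression with line-number nodes stripped, so retyping the same source reuses the first compiled function.

const _fn_cache = Dict{Expr,Function}()

macro cached(ex)
    Base.remove_linenums!(ex)                 # identical source text => identical key
    quote
        get!(_fn_cache, $(QuoteNode(ex))) do
            $(esc(ex))                        # only evaluated (and compiled) on a cache miss
        end
    end
end

With this, f = @cached(x -> exp(x)) returns the same object every time the same literal is typed, so anything specialized on it compiles once. It is only safe for functions that do not capture local variables.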
We should use some code like that to automate this in the REPL. If you're willing to type the macro yourself, it can serve as a workaround in the meantime.
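A sketch of what that automation could look like, under two loud assumptions: a Julia version new enough that the REPL backend exposes an ast_transforms vector (nothing like this existed when the issue was filed), and the _fn_cache dictionary from the sketch above.

function cache_anonymous(ex)
    ex isa Expr || return ex
    if ex.head === :->
        # Unsafe for closures capturing local variables and for do-block syntax;
        # a real transform would detect and skip those cases.
        key = Base.remove_linenums!(copy(ex))
        return :(get!(() -> $ex, Main._fn_cache, $(QuoteNode(key))))
    end
    return Expr(ex.head, map(cache_anonymous, ex.args)...)
end

# Installing it rewrites every REPL input before evaluation:
# push!(Base.active_repl_backend.ast_transforms, cache_anonymous)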
The ApproxFun and SingularIntegralEquations packages were built in part to give users the feeling they are "numerically computing with functions," to quote Nick Trefethen. When they started in Julia v0.3, generic functions did not require a re-compilation of Fun constructors. Since Julia v0.5, generic functions are now their own types and the constructors require re-compilation.
In certain cases, the re-compilation is not so bad. Of course, after declaring a generic function, we see that the run-time calculation is actually very fast. On the other hand, for more complicated constructors, such as bivariate constructors in ApproxFun and SingularIntegralEquations, the compile-time is now beyond interesting; but again, the run-time is actually quite fast.
I'd like to know what the plan is for this scenario -- certainly the brilliant minds behind Julia have thought of this. With compilation times this high (substantially higher than when the ApproxFun and SingularIntegralEquations packages started), it's hard to call this aspect of the current rendition of functions in Julia a dynamic experience.
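A minimal illustration of that re-compilation (the construct function below is a hypothetical stand-in for a Fun-style constructor, not ApproxFun code): every function literal has its own concrete type, so a method that specializes on its argument is compiled anew for each literal.

construct(f) = f(0.0)          # stand-in for a constructor that specializes on f

construct(x -> exp(x))         # compiles construct for this literal's type
construct(x -> exp(x))         # a new literal is a new type, so it compiles again

The struct trick and the proposed cache both address this by letting the second call reuse already-compiled code.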