
Add inlined methods for ntuple(f, Val{N}) for 0 ≤ N ≤ 15 #21446

Merged
merged 4 commits into from
Apr 26, 2017

Conversation

wsshin
Contributor

@wsshin wsshin commented Apr 19, 2017

This PR adds

ntuple{F}(f::F, ::Type{Val{1}}) = (@_inline_meta; (f(1),))
ntuple{F}(f::F, ::Type{Val{2}}) = (@_inline_meta; (f(1), f(2)))
ntuple{F}(f::F, ::Type{Val{3}}) = (@_inline_meta; (f(1), f(2), f(3)))

ntuple{F}(f::F, ::Type{Val{10}}) = (@_inline_meta; (f(1), f(2), f(3), f(4), f(5), f(6), f(7), f(8), f(9), f(10)))

so that calls to ntuple(f, Val{N}) are inlined for small N satisfying 1 ≤ N ≤ 10.
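As a quick usage sketch (not part of the PR): the plain-integer form below works on any Julia version, but its length argument is a runtime value, whereas the Val form added here encodes N in the type, so the hand-written methods can be inlined and the result length inferred.

```julia
# Plain-integer form: `n` is a runtime value, so the result's length is
# not visible to the compiler at the call site.
t = ntuple(i -> 2i, 3)
# t == (2, 4, 6)
# The Val{N} form from this PR (0.6 syntax) instead dispatches on the
# length, making the tuple length part of the call's type information.
```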

@Sacha0
Member

Sacha0 commented Apr 19, 2017

@nanosoldier runbenchmarks(ALL, vs=":master")

@ararslan ararslan added collections Data structures holding multiple items, e.g. sets performance Must go faster labels Apr 19, 2017
@nanosoldier
Collaborator

Your benchmark job has completed - possible performance regressions were detected. A full report can be found here. cc @jrevels

@andyferris
Member

Should this go up to Core.Inference.MAX_TUPLE_LEN or whatever it is?

@TotalVerb
Contributor

Might as well include the empty case N = 0 too, right?

@andyferris
Member

Should this go up to Core.Inference.MAX_TUPLE_LEN or whatever it is?

Hmm... out of curiosity, does anyone know what happened to MAX_TUPLETYPE_LEN, MAX_TUPLE_DEPTH and MAX_TUPLE_SPLAT on v0.6? (And has the behavior for splatting, recursion, etc changed?)

@TotalVerb
Contributor

@andyferris They've been moved to a struct; see #18496.

@andyferris
Member

Cool - thank you very much! :)

@wsshin wsshin changed the title Add inlined methods for ntuple(f, Val{N}) for 1 ≤ N ≤ 10 Add inlined methods for ntuple(f, Val{N}) for 0 ≤ N ≤ 15 Apr 20, 2017
base/tuple.jl Outdated
@@ -123,6 +123,23 @@ end
_ntuple(f, n) = (@_noinline_meta; ([f(i) for i = 1:n]...))

# inferrable ntuple
ntuple{F}(f::F, ::Type{Val{0}}) = (@_inline_meta; ())
Contributor

we should also aim to use the new syntax
ntuple(f::F, ...) where {F}

Contributor Author

True, but I think that change should be applied to the entire codebase at once later.

Contributor

not necessarily, we have had PRs to gradually update the syntax ;)

Member

Cheers re. using where syntax. (Though IIRC, method specialization on function arguments happens automagically when the function arguments are called directly in the method's body, so chances are type parameter F is unnecessary.) Best!
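For concreteness, a minimal sketch of the two signatures under discussion (the function names applyf and applyg are hypothetical, purely for illustration; the where form works on 0.6-era and current Julia):

```julia
# `::F ... where {F}` forces a distinct method specialization per function type.
applyf(f::F) where {F} = f(1)

# Without the type parameter, Julia normally still specializes, because `f`
# is called directly in the method body, as the comment above notes.
applyg(f) = f(1)
```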

Contributor

even better

Contributor Author

@Sacha0, interesting. So, did #20282 (comment) happen because the function argument was not called inside the method body?

Member

Likely yes :).

@wsshin
Contributor Author

wsshin commented Apr 20, 2017

Added the cases for N = 0 and 11 ≤ N ≤ 15. Wanted to do this without explicitly writing out those 16 definitions, but couldn't figure out a good way to automatically generate them. Any suggestions?

@pabloferz
Contributor

How about this

for N = 0:15
    tup = Expr(:tuple)
    tup.args = [Expr(:call, :f, i) for i = 1:N]
    @eval ntuple{F}(f::F, ::Type{Val{$N}}) = (@_inline_meta; $tup)
end
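For reference, a sketch of what that loop builds for a single N (assumes a normal session rather than bootstrap, so that `:` and comprehensions are available):

```julia
# Build the tuple expression (f(1), f(2), f(3)) for N = 3, i.e. the body
# that would be spliced into the generated ntuple method.
tup = Expr(:tuple)
tup.args = [Expr(:call, :f, i) for i in 1:3]
println(tup)   # prints the expression in surface syntax
```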

@andyferris
Member

andyferris commented Apr 20, 2017

@pabloferz Can we change that 15 to refer to the corresponding value in Core.Inference? Or is that not possible?

@wsshin
Contributor Author

wsshin commented Apr 20, 2017

@pabloferz, I actually tried something similar, but it seems that @eval isn't available during bootstrap.

@andyferris, I also wanted to get 15 from Core.Inference. It seems that we need to access an instance of InferenceParams, but I'm not sure how...

@wsshin
Contributor Author

wsshin commented Apr 20, 2017

@pabloferz, FYI, the following is the error I get during build procedure whenever I try to use @eval inside tuple.jl:

$ make -j 8
    JULIA usr/lib/julia/inference.ji
essentials.jl
ctypes.jl
generator.jl
reflection.jl
options.jl
promotion.jl
tuple.jl
error during bootstrap:
UndefVarError(var=:colon)
rec_backtrace at /Users/wsshin/packages/julia/julia-master/src/stackwalk.c:84
record_backtrace at /Users/wsshin/packages/julia/julia-master/src/task.c:239 [inlined]
jl_throw at /Users/wsshin/packages/julia/julia-master/src/task.c:565
jl_undefined_var_error at /Users/wsshin/packages/julia/julia-master/src/rtutils.c:129
jl_get_binding_or_error at /Users/wsshin/packages/julia/julia-master/src/module.c:236
unknown function (ip: 0x10a42034b)
jl_call_fptr_internal at /Users/wsshin/packages/julia/julia-master/src/./julia_internal.h:326 [inlined]
jl_call_method_internal at /Users/wsshin/packages/julia/julia-master/src/./julia_internal.h:345 [inlined]
jl_toplevel_eval_flex at /Users/wsshin/packages/julia/julia-master/src/toplevel.c:587
jl_parse_eval_all at /Users/wsshin/packages/julia/julia-master/src/ast.c:873
jl_load at /Users/wsshin/packages/julia/julia-master/src/toplevel.c:614 [inlined]
jl_load_ at /Users/wsshin/packages/julia/julia-master/src/toplevel.c:621
unknown function (ip: 0x10a41ef5f)
do_call at /Users/wsshin/packages/julia/julia-master/src/interpreter.c:75
eval at /Users/wsshin/packages/julia/julia-master/src/interpreter.c:242
jl_interpret_toplevel_expr at /Users/wsshin/packages/julia/julia-master/src/interpreter.c:34
jl_toplevel_eval_flex at /Users/wsshin/packages/julia/julia-master/src/toplevel.c:575
jl_eval_module_expr at /Users/wsshin/packages/julia/julia-master/src/toplevel.c:203
jl_toplevel_eval_flex at /Users/wsshin/packages/julia/julia-master/src/toplevel.c:478
jl_toplevel_eval_in at /Users/wsshin/packages/julia/julia-master/src/builtins.c:484
unknown function (ip: 0x10a41e863)
do_call at /Users/wsshin/packages/julia/julia-master/src/interpreter.c:75
eval at /Users/wsshin/packages/julia/julia-master/src/interpreter.c:242
jl_interpret_toplevel_expr at /Users/wsshin/packages/julia/julia-master/src/interpreter.c:34
jl_toplevel_eval_flex at /Users/wsshin/packages/julia/julia-master/src/toplevel.c:575
jl_parse_eval_all at /Users/wsshin/packages/julia/julia-master/src/ast.c:873
jl_load at /Users/wsshin/packages/julia/julia-master/src/toplevel.c:614
exec_program at /Users/wsshin/packages/julia/julia-master/usr/bin/julia (unknown line)
true_main at /Users/wsshin/packages/julia/julia-master/usr/bin/julia (unknown line)
main at /Users/wsshin/packages/julia/julia-master/usr/bin/julia (unknown line)

make[1]: *** [/Users/wsshin/packages/julia/julia-master/usr/lib/julia/inference.ji] Error 1
make: *** [julia-inference] Error 2

@vchuravy
Member

You can get access to an instance of InferenceParams, but you can't get access to the instance of InferenceParams that is currently being used to infer the code.

params = Core.Inference.InferenceParams(typemax(UInt))

Instead of @eval, try eval(quote ... end)

@wsshin
Contributor Author

wsshin commented Apr 21, 2017

@vchuravy, thanks for the comments! Unfortunately, replacing @eval with eval(quote ... end) doesn't work either. However, I think I have found the true cause of the error: it seems to be the for loop itself.

If I simply put inside tuple.jl the following for loop

for N = 1:15
end

without a loop body, then I get exactly the same build error starting with error during bootstrap: UndefVarError(var=:colon).

If I replace the for loop with the following while loop

N = 0
while N ≤ 15
    N += 1
end

then I get a similar error starting with error during bootstrap: UndefVarError(var=:≤).

So, it seems that looping outside a function definition is not allowed during bootstrap.

Any workaround for this?

@TotalVerb
Contributor

TotalVerb commented Apr 21, 2017

This early in bootstrap, neither : nor ≤ is available (they haven't been defined yet). I suspect <= might also be unavailable. An alternative is to move this code later in bootstrap, i.e. out of tuple.jl.

@pabloferz
Contributor

pabloferz commented Apr 21, 2017

The problem is not @eval as it is already defined at this stage, the issue is that there's almost nothing yet available, not even integer arithmetic. One option would be to resort to intrinsics, e.g.

let N = 0
    while sle_int(N, 15)
        tup = Expr(:tuple)
        tup.args = Array{Any,1}(N)
        i = 1
        while sle_int(i, N)
            Core.arrayset(tup.args, Expr(:call, :f, i), i)
            i = add_int(i, 1)
        end
        @eval ntuple{F}(f::F, ::Type{Val{$N}}) = (@_inline_meta; $tup)
        N = add_int(N, 1)
    end
end
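As a side note on the intrinsics used above: sle_int and add_int live in Core.Intrinsics and work before <= and + have been defined (qualified names shown here; during Base's bootstrap they are available unqualified):

```julia
# Signed-integer intrinsics, callable even before Base arithmetic exists.
a = Core.Intrinsics.add_int(1, 2)   # integer addition
b = Core.Intrinsics.sle_int(1, 2)   # signed "less than or equal", returns a Bool
```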

@andyferris As things are now, it is not possible. We could restore the global constants MAX_TUPLETYPE_LEN and friends and use them as the default parameters for InferenceParams, but even in that case (as @vchuravy points out) that wouldn't necessarily correspond to the actual InferenceParams' tupletype_len parameter used at any point in time.

@TotalVerb
Contributor

Since this code is not needed for the rest of bootstrap, it would be better to simply move it to a file later in the bootstrap process.

@pabloferz
Contributor

As @TotalVerb said, another option would be moving this to another file, even if it's weird to have tuple-related functions outside tuple.jl.

A third option would be to have the above definition take effect after Base had been defined, i.e.

if isdefined(Main, :Base) 
    N = 0
    while sle_int(N, 15)
        tup = Expr(:tuple)
        tup.args = Array{Any,1}(N)
        i = 1
        while sle_int(i, N)
            Core.arrayset(tup.args, Expr(:call, :f, i), i)
            i = add_int(i, 1)
        end
        @eval ntuple{F}(f::F, ::Type{Val{$N}}) = (@_inline_meta; $tup)
        N = add_int(N, 1)
    end
end

@Sacha0
Member

Sacha0 commented Apr 21, 2017

Perhaps the simplest and clearest solution would be to leave the definitions written out explicitly as they are now, and ignore the slight inelegance? :)

@wsshin
Contributor Author

wsshin commented Apr 21, 2017

Thank you all for wonderful suggestions!

Personally, I would like to follow @Sacha0's suggestion.

  • @TotalVerb's suggestion of putting the ntuple(f, Val{N}) definitions outside tuple.jl is clever, but it leads to the definitions of ntuple(f, n) in one file and the definitions of ntuple(f, Val{N}) in another, which doesn't feel ideal.
  • I really like @pabloferz's elegant approach, but I believe simplicity is also an important criterion.

So, I will push the version following @Sacha0's suggestion. I hope the other suggestions made here will be considered in future versions of tuple.jl.

@Sacha0
Member

Sacha0 commented Apr 21, 2017

@nanosoldier runbenchmarks(ALL, vs=":master")

@nanosoldier
Collaborator

Your benchmark job has completed - possible performance regressions were detected. A full report can be found here. cc @jrevels

@Sacha0
Member

Sacha0 commented Apr 22, 2017

@nanosoldier runbenchmarks("broadcast" || "tuple", vs = ":master")

@nanosoldier
Collaborator

Your benchmark job has completed - possible performance regressions were detected. A full report can be found here. cc @jrevels

@wsshin
Contributor Author

wsshin commented Apr 22, 2017

Not sure why the change caused the performance regressions. The only change was the addition of the following lines:

ntuple(f, ::Type{Val{0}}) = (@_inline_meta; ())
ntuple(f, ::Type{Val{1}}) = (@_inline_meta; (f(1),))
ntuple(f, ::Type{Val{2}}) = (@_inline_meta; (f(1), f(2)))
ntuple(f, ::Type{Val{3}}) = (@_inline_meta; (f(1), f(2), f(3)))

ntuple(f, ::Type{Val{15}}) = (@_inline_meta; (f(1), f(2), f(3), f(4), f(5), f(6), f(7), f(8), f(9), f(10), f(11), f(12), f(13), f(14), f(15)))

The regressed tests are the following; they are either irrelevant to tuples or so small that they are easily affected by random fluctuations:

@Sacha0
Member

Sacha0 commented Apr 22, 2017

@nanosoldier runbenchmarks("array", vs = ":master")

@nanosoldier
Collaborator

Your benchmark job has completed - possible performance regressions were detected. A full report can be found here. cc @jrevels

@Sacha0
Member

Sacha0 commented Apr 22, 2017

The most recent nanosoldier results seem suspect (and apart from them, all potential regressions so far look like noise), so let's ask nanosoldier again: @nanosoldier runbenchmarks("array", vs = ":master")

@nanosoldier
Collaborator

Your benchmark job has completed - possible performance regressions were detected. A full report can be found here. cc @jrevels

@Sacha0
Member

Sacha0 commented Apr 23, 2017

The minor ["broadcast", "mix_scalar_tuple", (3, "tup_tup")] potential regression might be real. Though perhaps small enough to ignore, might be worth checking locally?

Contributor

@tkelman tkelman left a comment

I think we should merge this.

@pabloferz pabloferz merged commit 7eefa45 into JuliaLang:master Apr 26, 2017