Recursive function inferred on 0.6 but not 0.7 #26119

Closed
ararslan opened this issue on Feb 19, 2018 · 4 comments
Labels
compiler:inference (Type inference) · regression (Regression in behavior compared to a previous version)

Comments

@ararslan (Member)

Consider the following code for computing partial correlations:

function partialcor(X::AbstractVector, Y::AbstractVector, Z::AbstractMatrix)
    p = size(Z, 2)
    if p == 1
        # Base case: only one control variable left
        return partialcor(X, Y, vec(Z))
    end
    # Recursive case: peel off the first control variable and recurse
    # on the remaining columns
    Z₀ = view(Z, :, 1)
    ZmZ₀ = view(Z, :, 2:p)
    rxy = partialcor(X, Y, ZmZ₀)
    rxz = partialcor(X, Z₀, ZmZ₀)
    rzy = partialcor(Z₀, Y, ZmZ₀)
    return (rxy - rxz * rzy) / (sqrt(1 - rxz^2) * sqrt(1 - rzy^2))
end

function partialcor(X::AbstractVector, Y::AbstractVector, Z::AbstractVector)
    rxy = cor(X, Y)
    rxz = cor(X, Z)
    rzy = cor(Z, Y)
    return (rxy - rxz * rzy) / (sqrt(1 - rxz^2) * sqrt(1 - rzy^2))
end

On 0.6, this is inferred just fine:

julia> using Base.Test

julia> X = rand(10, 4);

julia> @inferred partialcor(X[:,1], X[:,2], X[:,3:4])
-0.2837674355125281

On current master, things aren't so peachy:

julia> using Test

julia> X = rand(10, 4);

julia> @inferred partialcor(X[:,1], X[:,2], X[:,3:4])
ERROR: return type Float64 does not match inferred return type Any
Stacktrace:
 [1] error(::String) at ./error.jl:33
 [2] top-level scope

Now, what's particularly intriguing about this example is what exactly isn't getting inferred:

julia> @code_warntype partialcor(X[:,1], X[:,2], X[:,3:4])
Variables:
  X::Array{Float64,1}
  Y::Array{Float64,1}
  Z::Array{Float64,2}
  p<optimized out>
  Z₀<optimized out>
  ZmZ₀<optimized out>
  rxy::Any
  rxz::Float64
  rzy::Float64

Body:
  begin
      ...

The variable rxy is inferred as Any, rather than Float64 like its r* buddies. This inferential confusion propagates through, so the final result is inferred as Any as well. But if we rearrange the calls in the first method and add a type assertion, things are okay again:

function partialcor(X::AbstractVector, Y::AbstractVector, Z::AbstractMatrix)
    # Same stuff as before
    rxz = partialcor(X, Z₀, ZmZ₀)
    rzy = partialcor(Z₀, Y, ZmZ₀)
    rxy = partialcor(X, Y, ZmZ₀)::typeof(rxz)
    # ...

Then both rxy and the final result are correctly inferred.
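
For reference, here is the full workaround method, assembled from the two snippets above (the only changes from the original are the call order and the typeof(rxz) assertion):

function partialcor(X::AbstractVector, Y::AbstractVector, Z::AbstractMatrix)
    p = size(Z, 2)
    if p == 1
        return partialcor(X, Y, vec(Z))
    end
    Z₀ = view(Z, :, 1)
    ZmZ₀ = view(Z, :, 2:p)
    # Compute rxz first so that its (inferred) type is available, then
    # assert that the problematic rxy call returns the same type.
    rxz = partialcor(X, Z₀, ZmZ₀)
    rzy = partialcor(Z₀, Y, ZmZ₀)
    rxy = partialcor(X, Y, ZmZ₀)::typeof(rxz)
    return (rxy - rxz * rzy) / (sqrt(1 - rxz^2) * sqrt(1 - rzy^2))
end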

ararslan added the compiler:inference and regression labels on Feb 19, 2018
@vtjnash (Member) commented on Jul 10, 2018

working as intended (#21933)
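
The referenced PR changed how inference detects and limits recursion, so presumably the recursive partialcor calls now trip that limit. As a general illustration of recursion widening (a hypothetical function, not code from this issue):

# Hypothetical illustration of recursion limiting in inference. Each
# recursive call wraps the argument in one more tuple, so the call
# signature grows without bound; inference eventually stops unrolling
# and widens, much as rxy is widened to Any in partialcor above.
grow(x, n) = n <= 0 ? x : grow((x,), n - 1)

Running @code_warntype grow(1, 2) shows a widened return type rather than an exact union over all nesting depths.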

vtjnash closed this as completed on Jul 10, 2018
@ararslan (Member, Author)

I don't get it. So this regression was intentionally introduced in that PR?

@JeffBezanson (Member)

I don't know about this specific case, but yes, inferred types can sometimes get weaker. I often bring this up in debates on return_type and nobody seems to believe me, but it's true.
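
A sketch of why those debates matter (hypothetical usage, not code from this thread): anything built on Base.promote_op, which queries inference for a call's return type, bakes the current inference result into program behavior, so a weaker but still correct answer changes that behavior.

# Hypothetical example: promote_op asks inference for the result type of
# calling partialcor with these argument types. If inference weakens its
# answer from Float64 to Any between releases, the container's element
# type changes along with it.
T = Base.promote_op(partialcor, Vector{Float64}, Vector{Float64}, Matrix{Float64})
results = T[]  # Float64[] under 0.6's inference, Any[] under 0.7's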

@StefanKarpinski (Member)

Not only that: tighter inferred types are not always better. The optimal inferrer would know exactly when concrete types matter and when they don't, and would not infer anything tighter than Any in the latter case.
