
Error in Forward over Reverse Mode for Hessian #1628

Closed
dleather opened this issue Jul 10, 2024 · 8 comments

Comments

dleather commented Jul 10, 2024

Per request from the Discourse thread.

import DifferentiationInterface as AD
using Enzyme 

#Set initial values and define SpecialFunctions
t0 = 0.0
T = 0.25
θz = 0.5
θy = 2.0
u = 0.1
v = 0.4
z = [v, u, v^2, u*v, u*v, u^2]

function compute_EI1uv_aug(z, t0, T, θz, θy) 
    wu = compute_wu(t0, T, θz, θy)
    wv = compute_wv(t0, T, θz, θy)

    return wu * z[2] + wv * z[1] + 2*z[1]^2
end

function compute_wu(t0, T, θz, θy) 
    if θz ≠ θy
        numerator = -θy - θz * coth((T - t0) * θz) + exp((T - t0) * θy) * θz * csch((T - t0) * θz)
        denominator = (θy - θz) * (θy + θz)
        return numerator / denominator
    else
        numerator = exp((T + t0) * θz) * (exp(2 * t0 * θz) + exp(2 * T * θz) * 
            (-1 + 2 * T * θz - 2 * t0 * θz)) * csch((T - t0) * θz)
        denominator = 4 * θz
        return numerator / denominator
    end
end

function compute_wv(t0, T, θz, θy) 
    if θz ≠ θy
        numerator = exp((T - t0) * θy) * θy - exp((T - t0) * θy) * θz *
            coth((T - t0) * θz) + θz * csch((T - t0) * θz)
        denominator = (θy - θz) * (θy + θz)
        return numerator / denominator
    else
        numerator = exp(-(T + t0) * θz) * (exp((3 * T - t0) * θz) -
            exp((T + t0) * θz) * (1 + 2 * T * θz - 2 * t0 * θz)) * csch((T - t0) * θz)
        denominator = 4 * θz
        return numerator / denominator
    end
end

#Function to differentiate
R(z) = compute_EI1uv_aug(z, t0, T, θz, θy)
R(z) # Output: 0.4050218388870948

backend = AD.AutoEnzyme() #Set back-end

enzyme_grad = AD.gradient(R, backend, z) #Compute gradient
#Output: 6-element Vector{Float64}: 1.7754258975969854, 0.14851479848300594, 0.0, 0.0, 0.0, 0.0
enzyme_hess = AD.hessian(R, backend, z) #Compute hessian

This fails with the error:

ERROR: Active return values with automatic pullback (differential return value) deduction only supported for floating-like values and not type Any. If mutable memory, please use Duplicated. Otherwise, you can explicitly specify a pullback by using split mode, e.g. autodiff_thunk(ReverseSplitWithPrimal, ...)
Stacktrace:
[1] error(s::String)
@ Base .\error.jl:35
[2] default_adjoint
@ C:\Users\davle\.julia\packages\Enzyme\SiyIj\src\compiler.jl:6210 [inlined]
[3] autodiff_deferred(::ReverseMode{false, FFIABI, false}, f::Const{typeof(R)}, ::Type{Active}, args::Duplicated{Vector{…}})
@ Enzyme C:\Users\davle\.julia\packages\Enzyme\SiyIj\src\Enzyme.jl:454
[4] autodiff_deferred
@ C:\Users\davle\.julia\packages\Enzyme\SiyIj\src\Enzyme.jl:528 [inlined]
[5] gradient(f::Function, backend::DifferentiationInterfaceEnzymeExt.AutoDeferredEnzyme{…}, x::Vector{…}, ::DifferentiationInterface.NoGradientExtras)
@ DifferentiationInterfaceEnzymeExt C:\Users\davle\.julia\packages\DifferentiationInterface\ifUK5\ext\DifferentiationInterfaceEnzymeExt\reverse_onearg.jl:121
Some type information was truncated. Use show(err) to see complete types.

dleather (Author) commented:
Forgot to mention: declaring all function inputs with types Vector{T} or T, where T <: Real, fixes the issue.
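A minimal sketch of that fix, for reference. These are illustrative stand-in functions (not the originals from the post): annotating every argument as `Vector{S}` or `S` with `S <: Real` keeps Julia's return-type inference concrete, instead of falling back to `Any` when untyped globals are involved.

```julia
# Hypothetical stand-ins for compute_wu / compute_EI1uv_aug, showing the
# reported fix: every argument annotated with Vector{S} or S, S <: Real,
# so the return type is inferred concretely rather than as Any.
function weight(t0::S, T::S, θ::S) where {S<:Real}
    return exp((T - t0) * θ) / θ
end

function objective(z::Vector{S}, t0::S, T::S, θ::S) where {S<:Real}
    return weight(t0, T, θ) * z[2] + 2 * z[1]^2
end

objective([0.4, 0.1], 0.0, 0.25, 0.5)  # returns a concrete Float64
```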

wsmoses closed this as completed Jul 11, 2024

wsmoses commented Jul 11, 2024 via email


gdalle commented Jul 11, 2024

In the future, can you please ping me or open an issue on DI when this happens? Otherwise there is a very low probability that the bugs get fixed ^^


gdalle commented Jul 11, 2024

Also, do you have an idea why this happens, and why type annotations solve the problem?

dleather (Author) commented:
Quoting wsmoses's email reply:

> Oh this is just strictly a DI bug. This shouldn't happen if you use Enzyme directly.

Could you show me how you ran it without typing? I actually ran another version using the template provided for forward-over-reverse (FoR) mode before using DI, and I got the same error.


wsmoses commented Jul 21, 2024

@dleather apologies, I missed your note; below is an example of using Enzyme to compute the Hessian.

using Enzyme 

#Set initial values and define SpecialFunctions
t0 = 0.0
T = 0.25
θz = 0.5
θy = 2.0
u = 0.1
v = 0.4
z = [v, u, v^2, u*v, u*v, u^2]

function compute_EI1uv_aug(z, t0, T, θz, θy) 
    wu = compute_wu(t0, T, θz, θy)
    wv = compute_wv(t0, T, θz, θy)

    return wu * z[2] + wv * z[1] + 2*z[1]^2
end

function compute_wu(t0, T, θz, θy) 
    if θz ≠ θy
        numerator = -θy - θz * coth((T - t0) * θz) + exp((T - t0) * θy) * θz * csch((T - t0) * θz)
        denominator = (θy - θz) * (θy + θz)
        return numerator / denominator
    else
        numerator = exp((T + t0) * θz) * (exp(2 * t0 * θz) + exp(2 * T * θz) * 
            (-1 + 2 * T * θz - 2 * t0 * θz)) * csch((T - t0) * θz)
        denominator = 4 * θz
        return numerator / denominator
    end
end

function compute_wv(t0, T, θz, θy) 
    if θz ≠ θy
        numerator = exp((T - t0) * θy) * θy - exp((T - t0) * θy) * θz *
            coth((T - t0) * θz) + θz * csch((T - t0) * θz)
        denominator = (θy - θz) * (θy + θz)
        return numerator / denominator
    else
        numerator = exp(-(T + t0) * θz) * (exp((3 * T - t0) * θz) -
            exp((T + t0) * θz) * (1 + 2 * T * θz - 2 * t0 * θz)) * csch((T - t0) * θz)
        denominator = 4 * θz
        return numerator / denominator
    end
end

#Function to differentiate
R(z) = compute_EI1uv_aug(z, t0, T, θz, θy)
@show R(z) # Output: 0.4050218388870948

function grad(z, t0, T, θz, θy)
    dz = Enzyme.make_zero(z)
    Enzyme.autodiff_deferred(Reverse, compute_EI1uv_aug, Duplicated(z, dz), Const(t0), Const(T), Const(θz), Const(θy))
    return dz
end

enzyme_grad = grad(z, t0, T, θz, θy)
@show enzyme_grad
#Output: 6-element Vector{Float64}: 1.7754258975969854, 0.14851479848300594, 0.0, 0.0, 0.0, 0.0

enzyme_hess = [Enzyme.autodiff(Forward, grad, Duplicated(z, dz), Const(t0), Const(T), Const(θz), Const(θy))[1]
               for dz in onehot(z)]
@show enzyme_hess
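The same forward-over-reverse pattern can be sanity-checked on a toy function with a known Hessian. This is a sketch against the Enzyme API used in this thread (newer Enzyme versions may require wrapping the function in `Const`); `f` and `gradf` are hypothetical names.

```julia
using Enzyme

# Toy forward-over-reverse check: f(x) = x1^2 * x2, whose analytic
# Hessian is [2*x2  2*x1; 2*x1  0].
f(x) = x[1]^2 * x[2]

# Reverse-mode gradient; autodiff_deferred is needed so Enzyme can
# differentiate through this call again in forward mode.
function gradf(x)
    dx = Enzyme.make_zero(x)
    Enzyme.autodiff_deferred(Reverse, f, Duplicated(x, dx))
    return dx
end

x = [3.0, 5.0]
# Each one-hot tangent direction yields one Hessian column.
cols = [Enzyme.autodiff(Forward, gradf, Duplicated(x, v))[1] for v in Enzyme.onehot(x)]
H = reduce(hcat, cols)  # analytically [10.0 6.0; 6.0 0.0] at x = (3, 5)
```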


gdalle commented Jul 29, 2024

@wsmoses do you have any idea what this bug means for DI? The error message is rather obscure to me.


gdalle commented Oct 11, 2024

The bug has been fixed, presumably with DI's support for Constant arguments: JuliaDiff/DifferentiationInterface.jl#345
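For anyone landing here later, a hedged sketch of what that fix enables, assuming a DI version with context support (v0.6 or later) where `hessian` accepts `Constant` arguments; `g` is a made-up toy function. The non-differentiated scalars are passed as `Constant` contexts instead of being captured as untyped globals in a closure.

```julia
import DifferentiationInterface as AD
using Enzyme

# Toy function in the same shape as compute_EI1uv_aug: z is differentiated,
# the trailing scalars are held constant via DI's Constant context.
g(z, a, b) = a * z[1] + b * z[1] * z[2]

z = [0.4, 0.1]
H = AD.hessian(g, AD.AutoEnzyme(), z, AD.Constant(2.0), AD.Constant(3.0))
# Analytic Hessian of g in z: [0 b; b 0] = [0.0 3.0; 3.0 0.0]
```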
