
Task switch error on Enzyme v0.13 #2081

Open
MilesCranmer opened this issue Nov 11, 2024 · 35 comments

@MilesCranmer
Contributor

When trying to use Enzyme as the autodiff backend for SymbolicRegression searches, I ran into this error:

        nested task error: task switch not allowed from inside staged nor pure functions

The full stack trace:

┌ Error: Problem fitting the machine machine(SRRegressor(defaults = nothing, ), ). 
└ @ MLJBase ~/.julia/packages/MLJBase/7nGJF/src/machines.jl:694
[ Info: Running type checks... 
[ Info: Type checks okay. 
ERROR: LoadError: TaskFailedException
Stacktrace:
  [1] wait(t::Task)
    @ Base ./task.jl:370
  [2] fetch
    @ ./task.jl:390 [inlined]
  [3] _main_search_loop!(state::SymbolicRegression.SearchUtilsModule.SearchState{…}, datasets::Vector{…}, ropt::SymbolicRegression.SearchUtilsModule.RuntimeOptions{…}, options::Options{…})
    @ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:833
  [4] _equation_search(datasets::Vector{…}, ropt::SymbolicRegression.SearchUtilsModule.RuntimeOptions{…}, options::Options{…}, saved_state::Nothing)
    @ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:535
  [5] equation_search(datasets::Vector{…}; options::Options{…}, saved_state::Nothing, runtime_options::Nothing, runtime_options_kws::@Kwargs{})
    @ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:525
  [6] equation_search(X::Matrix{…}, y::Matrix{…}; niterations::Int64, weights::Nothing, options::Options{…}, variable_names::Vector{…}, display_variable_names::Vector{…}, y_variable_names::Nothing, parallelism::Symbol, numprocs::Nothing, procs::Nothing, addprocs_function::Nothing, heap_size_hint_in_bytes::Nothing, runtests::Bool, saved_state::Nothing, return_state::Bool, run_id::Nothing, loss_type::Type{…}, verbosity::Int64, logger::Nothing, progress::Nothing, X_units::Nothing, y_units::Nothing, extra::@NamedTuple{}, v_dim_out::Val{…}, multithreaded::Nothing)
    @ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:476
  [7] #equation_search#34
    @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:499 [inlined]
  [8] _update(m::SRRegressor{…}, verbosity::Int64, old_fitresult::Nothing, old_cache::Nothing, X::@NamedTuple{}, y::Vector{…}, w::Nothing, options::Options{…}, class::Vector{…})
    @ SymbolicRegression.MLJInterfaceModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:253
  [9] _update(m::SRRegressor{…}, verbosity::Int64, old_fitresult::Nothing, old_cache::Nothing, X::@NamedTuple{}, y::Vector{…}, w::Nothing, options::Options{…}, class::Nothing)
    @ SymbolicRegression.MLJInterfaceModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:220
 [10] update(m::SRRegressor{DynamicQuantities.SymbolicDimensions{…}, DataType}, verbosity::Int64, old_fitresult::Nothing, old_cache::Nothing, X::@NamedTuple{x1::Vector{…}, x2::Vector{…}, class::Vector{…}}, y::Vector{Float64}, w::Nothing)
    @ SymbolicRegression.MLJInterfaceModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:201
 [11] fit
    @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:189 [inlined]
 [12] fit(m::SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, verbosity::Int64, X::@NamedTuple{x1::Vector{Float64}, x2::Vector{Float64}, class::Vector{Int64}}, y::Vector{Float64})
    @ SymbolicRegression.MLJInterfaceModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:189
 [13] fit_only!(mach::MLJBase.Machine{SRRegressor{DynamicQuantities.SymbolicDimensions{…}, DataType}, SRRegressor{DynamicQuantities.SymbolicDimensions{…}, DataType}, true}; rows::Nothing, verbosity::Int64, force::Bool, composite::Nothing)
    @ MLJBase ~/.julia/packages/MLJBase/7nGJF/src/machines.jl:692
 [14] fit_only!
    @ ~/.julia/packages/MLJBase/7nGJF/src/machines.jl:617 [inlined]
 [15] #fit!#63
    @ ~/.julia/packages/MLJBase/7nGJF/src/machines.jl:789 [inlined]
 [16] fit!(mach::MLJBase.Machine{SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, true})
    @ MLJBase ~/.julia/packages/MLJBase/7nGJF/src/machines.jl:786
 [17] top-level scope
    @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/examples/parameterized_function.jl:103
 [18] include(fname::String)
    @ Main ./sysimg.jl:38
 [19] top-level scope
    @ REPL[3]:1

    nested task error: TaskFailedException
    Stacktrace:
     [1] wait(t::Task)
       @ Base ./task.jl:370
     [2] fetch
       @ ./task.jl:390 [inlined]
     [3] (::SymbolicRegression.var"#89#94"{SymbolicRegression.SearchUtilsModule.SearchState{}, Int64, Int64})()
       @ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:810
    
        nested task error: task switch not allowed from inside staged nor pure functions
        Stacktrace:
          [1] try_yieldto(undo::typeof(Base.ensure_rescheduled))
            @ Base ./task.jl:948
          [2] wait()
            @ Base ./task.jl:1022
          [3] wait(c::Base.GenericCondition{Base.Threads.SpinLock}; first::Bool)
            @ Base ./condition.jl:130
          [4] wait
            @ ./condition.jl:125 [inlined]
          [5] (::Base.var"#slowlock#733")(rl::ReentrantLock)
            @ Base ./lock.jl:157
          [6] lock
            @ ./lock.jl:147 [inlined]
          [7] cached_compilation
            @ ~/.julia/packages/Enzyme/RvNgp/src/compiler.jl:8412 [inlined]
          [8] thunkbase(ctx::LLVM.Context, mi::Core.MethodInstance, ::Val{…}, ::Type{…}, ::Type{…}, tt::Type{…}, ::Val{…}, ::Val{…}, ::Val{…}, ::Val{…}, ::Val{…}, ::Type{…}, ::Val{…}, ::Val{…})
            @ Enzyme.Compiler ~/.julia/packages/Enzyme/RvNgp/src/compiler.jl:8548
          [9] #s2104#19135
            @ ~/.julia/packages/Enzyme/RvNgp/src/compiler.jl:8685 [inlined]
         [10] 
            @ Enzyme.Compiler ./none:0
         [11] (::Core.GeneratedFunctionStub)(::UInt64, ::LineNumberNode, ::Any, ::Vararg{Any})
            @ Core ./boot.jl:707
         [12] autodiff
            @ ~/.julia/packages/Enzyme/RvNgp/src/Enzyme.jl:473 [inlined]
         [13] autodiff
            @ ~/.julia/packages/Enzyme/RvNgp/src/Enzyme.jl:537 [inlined]
         [14] autodiff
            @ ~/.julia/packages/Enzyme/RvNgp/src/Enzyme.jl:504 [inlined]
         [15] (::SymbolicRegression.ConstantOptimizationModule.GradEvaluator{…})(::Float64, G::Vector{…}, x::Vector{…})
            @ SymbolicRegressionEnzymeExt ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/ext/SymbolicRegressionEnzymeExt.jl:42
         [16] (::NLSolversBase.var"#69#70"{NLSolversBase.InplaceObjective{}, Float64})(G::Vector{Float64}, x::Vector{Float64})
            @ NLSolversBase ~/.julia/packages/NLSolversBase/kavn7/src/objective_types/incomplete.jl:54
         [17] value_gradient!!(obj::NLSolversBase.OnceDifferentiable{Float64, Vector{Float64}, Vector{Float64}}, x::Vector{Float64})
            @ NLSolversBase ~/.julia/packages/NLSolversBase/kavn7/src/interface.jl:82
         [18] initial_state(method::Optim.BFGS{…}, options::Optim.Options{…}, d::NLSolversBase.OnceDifferentiable{…}, initial_x::Vector{…})
            @ Optim ~/.julia/packages/Optim/ZhuZN/src/multivariate/solvers/first_order/bfgs.jl:94
         [19] optimize
            @ ~/.julia/packages/Optim/ZhuZN/src/multivariate/optimize/optimize.jl:36 [inlined]
         [20] optimize(f::NLSolversBase.InplaceObjective{…}, initial_x::Vector{…}, method::Optim.BFGS{…}, options::Optim.Options{…}; inplace::Bool, autodiff::Symbol)
            @ Optim ~/.julia/packages/Optim/ZhuZN/src/multivariate/optimize/interface.jl:143
         [21] optimize
            @ ~/.julia/packages/Optim/ZhuZN/src/multivariate/optimize/interface.jl:139 [inlined]
         [22] macro expansion
            @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/ConstantOptimization.jl:76 [inlined]
         [23] _optimize_constants(dataset::Dataset{…}, member::PopMember{…}, options::Options{…}, algorithm::Optim.BFGS{…}, optimizer_options::Optim.Options{…}, idx::Nothing)
            @ SymbolicRegression.ConstantOptimizationModule ~/.julia/packages/DispatchDoctor/ZmxWH/src/stabilization.jl:314
         [24] macro expansion
            @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/ConstantOptimization.jl:46 [inlined]
         [25] dispatch_optimize_constants(dataset::Dataset{…}, member::PopMember{…}, options::Options{…}, idx::Nothing)
            @ SymbolicRegression.ConstantOptimizationModule ~/.julia/packages/DispatchDoctor/ZmxWH/src/stabilization.jl:314
         [26] macro expansion
            @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/ConstantOptimization.jl:27 [inlined]
         [27] optimize_constants(dataset::Dataset{…}, member::PopMember{…}, options::Options{…})
            @ SymbolicRegression.ConstantOptimizationModule ~/.julia/packages/DispatchDoctor/ZmxWH/src/stabilization.jl:314
         [28] macro expansion
            @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SingleIteration.jl:118 [inlined]
         [29] macro expansion
            @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/Utils.jl:159 [inlined]
         [30] macro expansion
            @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SingleIteration.jl:109 [inlined]
         [31] optimize_and_simplify_population(dataset::Dataset{…}, pop::Population{…}, options::Options{…}, curmaxsize::Int64, record::Dict{…})
            @ SymbolicRegression.SingleIterationModule ~/.julia/packages/DispatchDoctor/ZmxWH/src/stabilization.jl:314
         [32] macro expansion
            @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:1087 [inlined]
         [33] _dispatch_s_r_cycle(in_pop::Population{…}, dataset::Dataset{…}, options::Options{…}; pop::Int64, out::Int64, iteration::Int64, verbosity::Int64, cur_maxsize::Int64, running_search_statistics::SymbolicRegression.AdaptiveParsimonyModule.RunningSearchStatistics)
            @ SymbolicRegression ~/.julia/packages/DispatchDoctor/ZmxWH/src/stabilization.jl:314
         [34] macro expansion
            @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:762 [inlined]
         [35] (::SymbolicRegression.var"#86#88"{})()
            @ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SearchUtils.jl:263
in expression starting at /Users/mcranmer/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/examples/parameterized_function.jl:103
Some type information was truncated. Use `show(err)` to see complete types.

To reproduce, you can run the example here: https://ai.damtp.cam.ac.uk/symbolicregression/dev/examples/parameterized_function/ and swap :Zygote for :Enzyme.

For example:

using SymbolicRegression
using Random: MersenneTwister
using Enzyme
using MLJBase: machine, fit!, predict, report
using Test

# Synthetic dataset: two features plus a per-sample class in {1, 2}
X = let rng = MersenneTwister(0), n = 30
    (; x1=randn(rng, n), x2=randn(rng, n), class=rand(rng, 1:2, n))
end

# The target uses class-dependent parameters P1 and P2
y = let P1 = [0.1, 1.5], P2 = [3.2, 0.5]
    [2 * cos(x2 + P1[class]) + x1^2 - P2[class] for (x1, x2, class) in zip(X.x1, X.x2, X.class)]
end

model = SRRegressor(;
    niterations=100,
    binary_operators=[+, *, /, -],
    unary_operators=[cos, exp],
    populations=30,
    expression_type=ParametricExpression,
    expression_options=(; max_parameters=2),
    autodiff_backend=:Enzyme,  # swapping this from :Zygote triggers the error
);

mach = machine(model, X, y)
fit!(mach)

Could it be because I am running Enzyme.jl from a task within the code?
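
Roughly, the pattern is that the first autodiff call (and hence the first Enzyme compile) happens inside a spawned task rather than on the main thread. A minimal sketch of that shape, with a placeholder function rather than SymbolicRegression's actual internals:

using Enzyme

square(x) = x * x

t = Threads.@spawn begin
    # the first compilation of the Enzyme thunk happens on this task,
    # not on the main thread
    Enzyme.autodiff(Reverse, square, Active, Active(2.0))
end
fetch(t)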

More context: Enzyme.jl used to work for this, and I don't think I changed anything that would cause new behavior on my end. But I just switched to v0.13, so I'm not sure if something changed. I can't test v0.12 due to the error here: #2080

@wsmoses
Member

wsmoses commented Nov 12, 2024

@vchuravy this looks like a problem of Julia not liking the locks of the caching mechanism all of a sudden?

I have no idea why

@wsmoses
Member

wsmoses commented Nov 18, 2024

@MilesCranmer from the looks of it, it seems like this is related to doing a first compile in a task.

Separately, we've been consistently running the integration CI you made a while ago, and things continue to pass. Presumably that is set to the version at the time, so perhaps you can find what changed on your end to cause the issue?

@MilesCranmer
Contributor Author

The main difference is that the first compile on my system is in a worker thread, whereas in the CI, it’s the main thread. Nothing has changed on my side though.

@MilesCranmer
Contributor Author

Friendly ping on this

@wsmoses
Member

wsmoses commented Nov 28, 2024

This issue is a bit out of my wheelhouse. @vchuravy's help will probably be needed here

@vchuravy
Member

@MilesCranmer it looks like you are calling autodiff from within the generator of a generated function?

@vchuravy
Member

Ah, no...

Enzyme uses a generated function (side-eyes billy) and from there invokes cached_compilation, which takes a lock.
This might "happen" to work if it hits the fast path and there is no lock contention...
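
A minimal sketch of that failure mode (illustrative names only, not Enzyme's actual internals): Julia forbids task switches while the generator of a generated function is running, so a contended lock acquired there falls into Base's slow path, which calls wait() and throws exactly this error; uncontended, the fast path succeeds and everything "happens" to work.

const COMPILE_LOCK = ReentrantLock()

@generated function cached_thunk(x)
    # This body runs as the generator. lock() succeeds on the uncontended
    # fast path, but under contention Base falls into slowlock -> wait(),
    # and wait() here throws:
    #   "task switch not allowed from inside staged nor pure functions"
    lock(COMPILE_LOCK)
    try
        # cache lookup / compilation would happen here
    finally
        unlock(COMPILE_LOCK)
    end
    return :(x)
end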

@vchuravy
Member

@MilesCranmer Can you try using Enzyme.set_abi(Reverse, NonGenABI) or Enzyme.set_abi(Forward, NonGenABI)?

@MilesCranmer
Contributor Author

(Just confirming – my call to Enzyme is indeed not within any generated functions)

  • Enzyme.set_abi(Reverse, NonGenABI) => same error
  • Enzyme.set_abi(Forward, NonGenABI) => same error
  • [both] => same error

@wsmoses
Member

wsmoses commented Dec 1, 2024

To be clear, that's not a global setting; you would do:

Enzyme.autodiff(set_abi(Forward, NonGenABI), …)
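
That is, for the reverse mode used here, the call shape would be something like the following sketch (square is a placeholder function; this assumes NonGenABI is in scope as in the suggestion above):

using Enzyme

square(x) = x * x

# per-call ABI override rather than a global setting:
Enzyme.autodiff(set_abi(Reverse, NonGenABI), square, Active, Active(3.0))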

Can you also include the stack trace?

@MilesCranmer
Contributor Author

Oh I see. I only have reverse-mode set up at the moment; here's the full error with that tweak:

$ julia --project=. examples/parameterized_function.jl
[ Info: Training machine(SRRegressor(defaults = nothing, ), ).
[ Info: Started!
┌ Error: Problem fitting the machine machine(SRRegressor(defaults = nothing, ), ). 
└ @ MLJBase ~/.julia/packages/MLJBase/7nGJF/src/machines.jl:694
[ Info: Running type checks... 
[ Info: Type checks okay. 
ERROR: LoadError: TaskFailedException
Stacktrace:
  [1] wait(t::Task)
    @ Base ./task.jl:370
  [2] fetch
    @ ./task.jl:390 [inlined]
  [3] _main_search_loop!(state::SymbolicRegression.SearchUtilsModule.SearchState{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, Task, Channel}, datasets::Vector{Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}, ropt::SymbolicRegression.SearchUtilsModule.RuntimeOptions{:multithreading, 1, true, Nothing}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5})
    @ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:833
  [4] _equation_search(datasets::Vector{Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}, ropt::SymbolicRegression.SearchUtilsModule.RuntimeOptions{:multithreading, 1, true, Nothing}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, saved_state::Nothing)
    @ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:535
  [5] equation_search(datasets::Vector{Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}; options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, saved_state::Nothing, runtime_options::Nothing, runtime_options_kws::@Kwargs{niterations::Int64, parallelism::Symbol, numprocs::Nothing, procs::Nothing, addprocs_function::Nothing, heap_size_hint_in_bytes::Nothing, runtests::Bool, return_state::Bool, run_id::Nothing, verbosity::Int64, logger::Nothing, progress::Nothing, v_dim_out::Val{1}})
    @ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:525
  [6] equation_search
    @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:506 [inlined]
  [7] equation_search(X::Matrix{Float64}, y::Matrix{Float64}; niterations::Int64, weights::Nothing, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, variable_names::Vector{String}, display_variable_names::Vector{String}, y_variable_names::Nothing, parallelism::Symbol, numprocs::Nothing, procs::Nothing, addprocs_function::Nothing, heap_size_hint_in_bytes::Nothing, runtests::Bool, saved_state::Nothing, return_state::Bool, run_id::Nothing, loss_type::Type{Nothing}, verbosity::Int64, logger::Nothing, progress::Nothing, X_units::Nothing, y_units::Nothing, extra::@NamedTuple{class::Vector{Int64}}, v_dim_out::Val{1}, multithreaded::Nothing)
    @ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:476
  [8] #equation_search#21
    @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:499 [inlined]
  [9] _update(m::SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, verbosity::Int64, old_fitresult::Nothing, old_cache::Nothing, X::@NamedTuple{x1::Vector{Float64}, x2::Vector{Float64}}, y::Vector{Float64}, w::Nothing, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, class::Vector{Int64})
    @ SymbolicRegression.MLJInterfaceModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:253
 [10] _update(m::SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, verbosity::Int64, old_fitresult::Nothing, old_cache::Nothing, X::@NamedTuple{x1::Vector{Float64}, x2::Vector{Float64}, class::Vector{Int64}}, y::Vector{Float64}, w::Nothing, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, class::Nothing)
    @ SymbolicRegression.MLJInterfaceModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:220
 [11] update(m::SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, verbosity::Int64, old_fitresult::Nothing, old_cache::Nothing, X::@NamedTuple{x1::Vector{Float64}, x2::Vector{Float64}, class::Vector{Int64}}, y::Vector{Float64}, w::Nothing)
    @ SymbolicRegression.MLJInterfaceModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:201
 [12] fit
    @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:189 [inlined]
 [13] fit(m::SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, verbosity::Int64, X::@NamedTuple{x1::Vector{Float64}, x2::Vector{Float64}, class::Vector{Int64}}, y::Vector{Float64})
    @ SymbolicRegression.MLJInterfaceModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:189
 [14] fit_only!(mach::MLJBase.Machine{SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, true}; rows::Nothing, verbosity::Int64, force::Bool, composite::Nothing)
    @ MLJBase ~/.julia/packages/MLJBase/7nGJF/src/machines.jl:692
 [15] fit_only!
    @ ~/.julia/packages/MLJBase/7nGJF/src/machines.jl:617 [inlined]
 [16] #fit!#63
    @ ~/.julia/packages/MLJBase/7nGJF/src/machines.jl:789 [inlined]
 [17] fit!(mach::MLJBase.Machine{SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, true})
    @ MLJBase ~/.julia/packages/MLJBase/7nGJF/src/machines.jl:786
 [18] top-level scope
    @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/examples/parameterized_function.jl:101

    nested task error: TaskFailedException
    Stacktrace:
     [1] wait(t::Task)
       @ Base ./task.jl:370
     [2] fetch
       @ ./task.jl:390 [inlined]
     [3] (::SymbolicRegression.var"#56#61"{SymbolicRegression.SearchUtilsModule.SearchState{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, Task, Channel}, Int64, Int64})()
       @ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:810
    
        nested task error: TaskFailedException
        Stacktrace:
          [1] wait(t::Task)
            @ Base ./task.jl:370
          [2] fetch
            @ ./task.jl:390 [inlined]
          [3] with_stacksize
            @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/ext/SymbolicRegressionEnzymeExt.jl:31 [inlined]
          [4] (::SymbolicRegression.ConstantOptimizationModule.GradEvaluator{SymbolicRegression.ConstantOptimizationModule.Evaluator{ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, @NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}}, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, Nothing}, ADTypes.AutoEnzyme{Nothing, Nothing}, @NamedTuple{storage_tree::ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, storage_refs::@NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, storage_dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}})(::Float64, G::Vector{Float64}, x::Vector{Float64})
            @ SymbolicRegressionEnzymeExt ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/ext/SymbolicRegressionEnzymeExt.jl:41
          [5] (::NLSolversBase.var"#69#70"{NLSolversBase.InplaceObjective{Nothing, SymbolicRegression.ConstantOptimizationModule.GradEvaluator{SymbolicRegression.ConstantOptimizationModule.Evaluator{ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, @NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}}, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, Nothing}, ADTypes.AutoEnzyme{Nothing, Nothing}, @NamedTuple{storage_tree::ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, storage_refs::@NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, storage_dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}}, Nothing, Nothing, Nothing}, Float64})(G::Vector{Float64}, x::Vector{Float64})
            @ NLSolversBase ~/.julia/packages/NLSolversBase/kavn7/src/objective_types/incomplete.jl:54
          [6] value_gradient!!(obj::NLSolversBase.OnceDifferentiable{Float64, Vector{Float64}, Vector{Float64}}, x::Vector{Float64})
            @ NLSolversBase ~/.julia/packages/NLSolversBase/kavn7/src/interface.jl:82
          [7] initial_state(method::Optim.BFGS{LineSearches.InitialStatic{Float64}, LineSearches.BackTracking{Float64, Int64}, Nothing, Nothing, Optim.Flat}, options::Optim.Options{Float64, Nothing}, d::NLSolversBase.OnceDifferentiable{Float64, Vector{Float64}, Vector{Float64}}, initial_x::Vector{Float64})
            @ Optim ~/.julia/packages/Optim/fBdaz/src/multivariate/solvers/first_order/bfgs.jl:94
          [8] optimize
            @ ~/.julia/packages/Optim/fBdaz/src/multivariate/optimize/optimize.jl:36 [inlined]
          [9] optimize(f::NLSolversBase.InplaceObjective{Nothing, SymbolicRegression.ConstantOptimizationModule.GradEvaluator{SymbolicRegression.ConstantOptimizationModule.Evaluator{ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, @NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}}, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, Nothing}, ADTypes.AutoEnzyme{Nothing, Nothing}, @NamedTuple{storage_tree::ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, storage_refs::@NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, storage_dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}}, Nothing, Nothing, Nothing}, initial_x::Vector{Float64}, method::Optim.BFGS{LineSearches.InitialStatic{Float64}, LineSearches.BackTracking{Float64, Int64}, Nothing, Nothing, Optim.Flat}, options::Optim.Options{Float64, Nothing}; inplace::Bool, autodiff::Symbol)
            @ Optim ~/.julia/packages/Optim/fBdaz/src/multivariate/optimize/interface.jl:143
         [10] optimize
            @ ~/.julia/packages/Optim/fBdaz/src/multivariate/optimize/interface.jl:139 [inlined]
         [11] _optimize_constants(dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, member::PopMember{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}}, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, algorithm::Optim.BFGS{LineSearches.InitialStatic{Float64}, LineSearches.BackTracking{Float64, Int64}, Nothing, Nothing, Optim.Flat}, optimizer_options::Optim.Options{Float64, Nothing}, idx::Nothing)
            @ SymbolicRegression.ConstantOptimizationModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/ConstantOptimization.jl:76
         [12] dispatch_optimize_constants(dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, member::PopMember{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, idx::Nothing)
            @ SymbolicRegression.ConstantOptimizationModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/ConstantOptimization.jl:46
         [13] optimize_constants(dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, member::PopMember{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5})
            @ SymbolicRegression.ConstantOptimizationModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/ConstantOptimization.jl:27
         [14] macro expansion
            @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SingleIteration.jl:118 [inlined]
         [15] macro expansion
            @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/Utils.jl:159 [inlined]
         [16] optimize_and_simplify_population(dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, pop::Population{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, curmaxsize::Int64, record::Dict{String, Any})
            @ SymbolicRegression.SingleIterationModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SingleIteration.jl:109
         [17] _dispatch_s_r_cycle(in_pop::Population{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}}, dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}; pop::Int64, out::Int64, iteration::Int64, verbosity::Int64, cur_maxsize::Int64, running_search_statistics::SymbolicRegression.AdaptiveParsimonyModule.RunningSearchStatistics)
            @ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:1087
         [18] macro expansion
            @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:762 [inlined]
         [19] (::SymbolicRegression.var"#53#55"{Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, Float64, SymbolicRegression.SearchUtilsModule.RuntimeOptions{:multithreading, 1, true, Nothing}, Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, Int64, Task, SymbolicRegression.AdaptiveParsimonyModule.RunningSearchStatistics, Int64, Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, Int64})()
            @ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SearchUtils.jl:263
        
            nested task error: AssertionError: Base.isconcretetype(typ)
            Stacktrace:
              [1] abs_typeof(arg::LLVM.ExtractValueInst, partial::Bool, seenphis::Set{LLVM.PHIInst})
                @ Enzyme.Compiler ~/.julia/packages/Enzyme/RvNgp/src/absint.jl:614
              [2] abs_typeof(arg::LLVM.ExtractValueInst, partial::Bool)
                @ Enzyme.Compiler ~/.julia/packages/Enzyme/RvNgp/src/absint.jl:281
              [3] codegen(output::Symbol, job::GPUCompiler.CompilerJob{Enzyme.Compiler.EnzymeTarget, Enzyme.Compiler.EnzymeCompilerParams}; libraries::Bool, deferred_codegen::Bool, optimize::Bool, toplevel::Bool, strip::Bool, validate::Bool, only_entry::Bool, parent_job::Nothing)
                @ Enzyme.Compiler ~/.julia/packages/Enzyme/RvNgp/src/compiler.jl:7095
              [4] codegen
                @ ~/.julia/packages/Enzyme/RvNgp/src/compiler.jl:6072 [inlined]
              [5] _thunk(job::GPUCompiler.CompilerJob{Enzyme.Compiler.EnzymeTarget, Enzyme.Compiler.EnzymeCompilerParams}, postopt::Bool)
                @ Enzyme.Compiler ~/.julia/packages/Enzyme/RvNgp/src/compiler.jl:8375
              [6] cached_compilation(job::GPUCompiler.CompilerJob)
                @ Enzyme.Compiler ~/.julia/packages/Enzyme/RvNgp/src/compiler.jl:8416
              [7] thunkbase
                @ ~/.julia/packages/Enzyme/RvNgp/src/compiler.jl:8548 [inlined]
              [8] thunk
                @ ~/.julia/packages/Enzyme/RvNgp/src/compiler.jl:8631 [inlined]
              [9] autodiff
                @ ~/.julia/packages/Enzyme/RvNgp/src/Enzyme.jl:473 [inlined]
             [10] autodiff
                @ ~/.julia/packages/Enzyme/RvNgp/src/Enzyme.jl:537 [inlined]
             [11] autodiff
                @ ~/.julia/packages/Enzyme/RvNgp/src/Enzyme.jl:504 [inlined]
             [12] (::SymbolicRegressionEnzymeExt.var"#1#2"{SymbolicRegression.ConstantOptimizationModule.GradEvaluator{SymbolicRegression.ConstantOptimizationModule.Evaluator{ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, @NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}}, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, Nothing}, ADTypes.AutoEnzyme{Nothing, Nothing}, @NamedTuple{storage_tree::ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, storage_refs::@NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, storage_dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}}, Vector{Float64}, Vector{Float64}})()
                @ SymbolicRegressionEnzymeExt ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/ext/SymbolicRegressionEnzymeExt.jl:42
in expression starting at /Users/mcranmer/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/examples/parameterized_function.jl:101

The code is here: https://github.com/MilesCranmer/SymbolicRegression.jl/blob/667df823cd2dd111db524b7cb0c495d1a583eb89/ext/SymbolicRegressionEnzymeExt.jl#L42

@wsmoses
Member

wsmoses commented Dec 1, 2024

Okay, good news: that's now a different error.

@MilesCranmer
Contributor Author

Same error on main. Is there any way I can get more debugging info out of this to see where abs_typeof is blowing up? As far as I know there should be no type instabilities here, and this same code used to work ok.

@wsmoses
Member

wsmoses commented Dec 1, 2024

What's the backtrace on main (and note the commit you used)? The line number you have shouldn't throw that error. With that, I'll try to make a patch to add more info to the relevant assertion.

@wsmoses
Member

wsmoses commented Dec 1, 2024

This might do it, if you give it a go: https://github.com/EnzymeAD/Enzyme.jl/pull/2149/files

@MilesCranmer
Contributor Author

Thanks. Here's the printout on e69f3c2:

            nested task error: AssertionError: Illegal absint of   %.fca.45.0.extract = extractvalue { {} addrspace(10)*, {} addrspace(10)*, { i8, {} addrspace(10)*, {} addrspace(10)*, i64, i64 }, i64, float, float, i32, i8, i8, float, i64, i64, i8, i8, i8, i8, {} addrspace(10)*, i64, float, i8, i8, i64, {} addrspace(10)*, float, float, i8, i8, double, i64, i64, float, float, i64, i64, i8, i8, float, i64, i64, i64, i8, {} addrspace(10)*, {} addrspace(10)*, {} addrspace(10)*, {} addrspace(10)*, [1 x i64], i8, i8, i64, i8, {} addrspace(10)*, float, i64, {} addrspace(10)*, {} addrspace(10)*, float, {} addrspace(10)*, i64, i8, i64, i8, i8, {} addrspace(10)*, i8, i8, i8 } %2, 45, 0, !dbg !90 ltyp=Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}}, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, typ=Optim.AbstractOptimizer, offset=UInt32[0x0000002d, 0x00000000], ind=0
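
The assertion names ltyp=Options{...} and typ=Optim.AbstractOptimizer, i.e. abs_typeof is recovering a struct field whose declared type is abstract, so Base.isconcretetype(typ) fails. A minimal illustration of that shape (MyOptions is hypothetical, not SymbolicRegression's actual Options):

using Optim

struct MyOptions
    optimizer::Optim.AbstractOptimizer  # abstract field type, like field 45 of Options above
end

Base.isconcretetype(fieldtype(MyOptions, :optimizer))  # false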

And here is the full backtrace:

1-element ExceptionStack:
LoadError: TaskFailedException
Stacktrace:
  [1] wait(t::Task)
    @ Base ./task.jl:370
  [2] fetch
    @ ./task.jl:390 [inlined]
  [3] _main_search_loop!(state::SymbolicRegression.SearchUtilsModule.SearchState{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, Task, Channel}, datasets::Vector{Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}, ropt::SymbolicRegression.SearchUtilsModule.RuntimeOptions{:multithreading, 1, true, Nothing}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5})
    @ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:833
  [4] _equation_search(datasets::Vector{Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}, ropt::SymbolicRegression.SearchUtilsModule.RuntimeOptions{:multithreading, 1, true, Nothing}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, saved_state::Nothing)
    @ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:535
  [5] equation_search(datasets::Vector{Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}; options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, saved_state::Nothing, runtime_options::Nothing, runtime_options_kws::@Kwargs{niterations::Int64, parallelism::Symbol, numprocs::Nothing, procs::Nothing, addprocs_function::Nothing, heap_size_hint_in_bytes::Nothing, runtests::Bool, return_state::Bool, run_id::Nothing, verbosity::Int64, logger::Nothing, progress::Nothing, v_dim_out::Val{1}})
    @ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:525
  [6] equation_search
    @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:506 [inlined]
  [7] equation_search(X::Matrix{Float64}, y::Matrix{Float64}; niterations::Int64, weights::Nothing, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, variable_names::Vector{String}, display_variable_names::Vector{String}, y_variable_names::Nothing, parallelism::Symbol, numprocs::Nothing, procs::Nothing, addprocs_function::Nothing, heap_size_hint_in_bytes::Nothing, runtests::Bool, saved_state::Nothing, return_state::Bool, run_id::Nothing, loss_type::Type{Nothing}, verbosity::Int64, logger::Nothing, progress::Nothing, X_units::Nothing, y_units::Nothing, extra::@NamedTuple{class::Vector{Int64}}, v_dim_out::Val{1}, multithreaded::Nothing)
    @ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:476
  [8] #equation_search#21
    @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:499 [inlined]
  [9] _update(m::SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, verbosity::Int64, old_fitresult::Nothing, old_cache::Nothing, X::@NamedTuple{x1::Vector{Float64}, x2::Vector{Float64}}, y::Vector{Float64}, w::Nothing, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, class::Vector{Int64})
    @ SymbolicRegression.MLJInterfaceModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:253
 [10] _update(m::SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, verbosity::Int64, old_fitresult::Nothing, old_cache::Nothing, X::@NamedTuple{x1::Vector{Float64}, x2::Vector{Float64}, class::Vector{Int64}}, y::Vector{Float64}, w::Nothing, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, class::Nothing)
    @ SymbolicRegression.MLJInterfaceModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:220
 [11] update(m::SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, verbosity::Int64, old_fitresult::Nothing, old_cache::Nothing, X::@NamedTuple{x1::Vector{Float64}, x2::Vector{Float64}, class::Vector{Int64}}, y::Vector{Float64}, w::Nothing)
    @ SymbolicRegression.MLJInterfaceModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:201
 [12] fit
    @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:189 [inlined]
 [13] fit(m::SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, verbosity::Int64, X::@NamedTuple{x1::Vector{Float64}, x2::Vector{Float64}, class::Vector{Int64}}, y::Vector{Float64})
    @ SymbolicRegression.MLJInterfaceModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:189
 [14] fit_only!(mach::MLJBase.Machine{SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, true}; rows::Nothing, verbosity::Int64, force::Bool, composite::Nothing)
    @ MLJBase ~/.julia/packages/MLJBase/7nGJF/src/machines.jl:692
 [15] fit_only!
    @ ~/.julia/packages/MLJBase/7nGJF/src/machines.jl:617 [inlined]
 [16] #fit!#63
    @ ~/.julia/packages/MLJBase/7nGJF/src/machines.jl:789 [inlined]
 [17] fit!(mach::MLJBase.Machine{SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, true})
    @ MLJBase ~/.julia/packages/MLJBase/7nGJF/src/machines.jl:786
 [18] top-level scope
    @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/examples/parameterized_function.jl:101
 [19] include(fname::String)
    @ Main ./sysimg.jl:38
 [20] top-level scope
    @ REPL[2]:1

    nested task error: TaskFailedException
    Stacktrace:
     [1] wait(t::Task)
       @ Base ./task.jl:370
     [2] fetch
       @ ./task.jl:390 [inlined]
     [3] (::SymbolicRegression.var"#56#61"{SymbolicRegression.SearchUtilsModule.SearchState{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, Task, Channel}, Int64, Int64})()
       @ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:810
    
        nested task error: TaskFailedException
        Stacktrace:
          [1] wait(t::Task)
            @ Base ./task.jl:370
          [2] fetch
            @ ./task.jl:390 [inlined]
          [3] with_stacksize
            @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/ext/SymbolicRegressionEnzymeExt.jl:31 [inlined]
          [4] (::SymbolicRegression.ConstantOptimizationModule.GradEvaluator{SymbolicRegression.ConstantOptimizationModule.Evaluator{ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, @NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}}, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, Nothing}, ADTypes.AutoEnzyme{Nothing, Nothing}, @NamedTuple{storage_tree::ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, storage_refs::@NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, storage_dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}})(::Float64, G::Vector{Float64}, x::Vector{Float64})
            @ SymbolicRegressionEnzymeExt ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/ext/SymbolicRegressionEnzymeExt.jl:41
          [5] (::NLSolversBase.var"#69#70"{NLSolversBase.InplaceObjective{Nothing, SymbolicRegression.ConstantOptimizationModule.GradEvaluator{SymbolicRegression.ConstantOptimizationModule.Evaluator{ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, @NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}}, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, Nothing}, ADTypes.AutoEnzyme{Nothing, Nothing}, @NamedTuple{storage_tree::ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, storage_refs::@NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, storage_dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}}, Nothing, Nothing, Nothing}, Float64})(G::Vector{Float64}, x::Vector{Float64})
            @ NLSolversBase ~/.julia/packages/NLSolversBase/kavn7/src/objective_types/incomplete.jl:54
          [6] value_gradient!!(obj::NLSolversBase.OnceDifferentiable{Float64, Vector{Float64}, Vector{Float64}}, x::Vector{Float64})
            @ NLSolversBase ~/.julia/packages/NLSolversBase/kavn7/src/interface.jl:82
          [7] initial_state(method::Optim.BFGS{LineSearches.InitialStatic{Float64}, LineSearches.BackTracking{Float64, Int64}, Nothing, Nothing, Optim.Flat}, options::Optim.Options{Float64, Nothing}, d::NLSolversBase.OnceDifferentiable{Float64, Vector{Float64}, Vector{Float64}}, initial_x::Vector{Float64})
            @ Optim ~/.julia/packages/Optim/fBdaz/src/multivariate/solvers/first_order/bfgs.jl:94
          [8] optimize
            @ ~/.julia/packages/Optim/fBdaz/src/multivariate/optimize/optimize.jl:36 [inlined]
          [9] optimize(f::NLSolversBase.InplaceObjective{Nothing, SymbolicRegression.ConstantOptimizationModule.GradEvaluator{SymbolicRegression.ConstantOptimizationModule.Evaluator{ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, @NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}}, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, Nothing}, ADTypes.AutoEnzyme{Nothing, Nothing}, @NamedTuple{storage_tree::ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, storage_refs::@NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, storage_dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}}, Nothing, Nothing, Nothing}, initial_x::Vector{Float64}, method::Optim.BFGS{LineSearches.InitialStatic{Float64}, LineSearches.BackTracking{Float64, Int64}, Nothing, Nothing, Optim.Flat}, options::Optim.Options{Float64, Nothing}; inplace::Bool, autodiff::Symbol)
            @ Optim ~/.julia/packages/Optim/fBdaz/src/multivariate/optimize/interface.jl:143
         [10] optimize
            @ ~/.julia/packages/Optim/fBdaz/src/multivariate/optimize/interface.jl:139 [inlined]
         [11] _optimize_constants(dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, member::PopMember{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}}, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, algorithm::Optim.BFGS{LineSearches.InitialStatic{Float64}, LineSearches.BackTracking{Float64, Int64}, Nothing, Nothing, Optim.Flat}, optimizer_options::Optim.Options{Float64, Nothing}, idx::Nothing)
            @ SymbolicRegression.ConstantOptimizationModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/ConstantOptimization.jl:76
         [12] dispatch_optimize_constants(dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, member::PopMember{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, idx::Nothing)
            @ SymbolicRegression.ConstantOptimizationModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/ConstantOptimization.jl:46
         [13] optimize_constants(dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, member::PopMember{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5})
            @ SymbolicRegression.ConstantOptimizationModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/ConstantOptimization.jl:27
         [14] macro expansion
            @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SingleIteration.jl:118 [inlined]
         [15] macro expansion
            @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/Utils.jl:159 [inlined]
         [16] optimize_and_simplify_population(dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, pop::Population{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, curmaxsize::Int64, record::Dict{String, Any})
            @ SymbolicRegression.SingleIterationModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SingleIteration.jl:109
         [17] _dispatch_s_r_cycle(in_pop::Population{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}}, dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}; pop::Int64, out::Int64, iteration::Int64, verbosity::Int64, cur_maxsize::Int64, running_search_statistics::SymbolicRegression.AdaptiveParsimonyModule.RunningSearchStatistics)
            @ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:1087
         [18] macro expansion
            @ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:762 [inlined]
         [19] (::SymbolicRegression.var"#53#55"{Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, Float64, SymbolicRegression.SearchUtilsModule.RuntimeOptions{:multithreading, 1, true, Nothing}, Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, Int64, Task, SymbolicRegression.AdaptiveParsimonyModule.RunningSearchStatistics, Int64, Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, Int64})()
            @ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SearchUtils.jl:263
        
            nested task error: AssertionError: Illegal absint of   %.fca.45.0.extract = extractvalue { {} addrspace(10)*, {} addrspace(10)*, { i8, {} addrspace(10)*, {} addrspace(10)*, i64, i64 }, i64, float, float, i32, i8, i8, float, i64, i64, i8, i8, i8, i8, {} addrspace(10)*, i64, float, i8, i8, i64, {} addrspace(10)*, float, float, i8, i8, double, i64, i64, float, float, i64, i64, i8, i8, float, i64, i64, i64, i8, {} addrspace(10)*, {} addrspace(10)*, {} addrspace(10)*, {} addrspace(10)*, [1 x i64], i8, i8, i64, i8, {} addrspace(10)*, float, i64, {} addrspace(10)*, {} addrspace(10)*, float, {} addrspace(10)*, i64, i8, i64, i8, i8, {} addrspace(10)*, i8, i8, i8 } %2, 45, 0, !dbg !90 ltyp=Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}}, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, typ=Optim.AbstractOptimizer, offset=UInt32[0x0000002d, 0x00000000], ind=0
            Stacktrace:
              [1] abs_typeof(arg::LLVM.Value, partial::Bool, seenphis::Set{LLVM.PHIInst})
                @ Enzyme.Compiler ~/.julia/packages/Enzyme/JYslI/src/absint.jl:642
              [2] abs_typeof
                @ ~/.julia/packages/Enzyme/JYslI/src/absint.jl:283 [inlined]
              [3] codegen(output::Symbol, job::GPUCompiler.CompilerJob{Enzyme.Compiler.EnzymeTarget, Enzyme.Compiler.EnzymeCompilerParams}; libraries::Bool, deferred_codegen::Bool, optimize::Bool, toplevel::Bool, strip::Bool, validate::Bool, only_entry::Bool, parent_job::Nothing)
                @ Enzyme.Compiler ~/.julia/packages/Enzyme/JYslI/src/compiler.jl:5257
              [4] codegen
                @ ~/.julia/packages/Enzyme/JYslI/src/compiler.jl:4196 [inlined]
              [5] _thunk(job::GPUCompiler.CompilerJob{Enzyme.Compiler.EnzymeTarget, Enzyme.Compiler.EnzymeCompilerParams}, postopt::Bool)
                @ Enzyme.Compiler ~/.julia/packages/Enzyme/JYslI/src/compiler.jl:6298
              [6] cached_compilation(job::GPUCompiler.CompilerJob)
                @ Enzyme.Compiler ~/.julia/packages/Enzyme/JYslI/src/compiler.jl:6339
              [7] thunkbase
                @ ~/.julia/packages/Enzyme/JYslI/src/compiler.jl:6452 [inlined]
              [8] thunk
                @ ~/.julia/packages/Enzyme/JYslI/src/compiler.jl:6535 [inlined]
              [9] autodiff
                @ ~/.julia/packages/Enzyme/JYslI/src/Enzyme.jl:485 [inlined]
             [10] autodiff
                @ ~/.julia/packages/Enzyme/JYslI/src/Enzyme.jl:544 [inlined]
             [11] autodiff
                @ ~/.julia/packages/Enzyme/JYslI/src/Enzyme.jl:516 [inlined]
             [12] (::SymbolicRegressionEnzymeExt.var"#1#2"{SymbolicRegression.ConstantOptimizationModule.GradEvaluator{SymbolicRegression.ConstantOptimizationModule.Evaluator{ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, @NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}}, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, Nothing}, ADTypes.AutoEnzyme{Nothing, Nothing}, @NamedTuple{storage_tree::ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, storage_refs::@NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, storage_dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}}, Vector{Float64}, Vector{Float64}})()
                @ SymbolicRegressionEnzymeExt ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/ext/SymbolicRegressionEnzymeExt.jl:42
in expression starting at /Users/mcranmer/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/examples/parameterized_function.jl:101

@wsmoses
Member

wsmoses commented Dec 1, 2024

Do you know what this type is from?


            Options{
                SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}},
                ParametricNode,
                ParametricExpression,
                @NamedTuple{max_parameters::Int64},
                MutationWeights,
                false, false,
                nothing,
                ADTypes.AutoEnzyme{Nothing, Nothing},
                5
            }

and what the 45th element type is (0-indexed; the 46th if 1-indexed)? And then, of that result, what the 0th (0-indexed) type is?

@MilesCranmer
Contributor Author

That is the options::Options variable, which holds the evaluation context for expression evaluation. It's passed to Enzyme here: https://github.com/MilesCranmer/SymbolicRegression.jl/blob/667df823cd2dd111db524b7cb0c495d1a583eb89/ext/SymbolicRegressionEnzymeExt.jl#L47
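
For context, the call boils down to something like the following (a hedged sketch, not the exact extension code; loss is a hypothetical stand-in for the real evaluator): the whole options struct goes in as a Const argument, so Enzyme's abstract interpreter has to walk all of its fields, including the abstractly-typed ones.

using Enzyme, SymbolicRegression

loss(x, options) = sum(abs2, x)  # stand-in; the real evaluator reads `options`

options = Options()
x = randn(3)
dx = zeros(3)
autodiff(Reverse, loss, Active, Duplicated(x, dx), Const(options))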

Looks like the 46th 1-indexed field is expression_type:

julia> options = Options();

julia> propertynames(options)[46]
:expression_type

This is the ParametricExpression bit. The type parameters of ParametricExpression are unknown, since the Options struct only stores the wrapper type (the full type would be something like ParametricExpression{Float64,..}).

The ParametricExpression is defined here: https://github.com/SymbolicML/DynamicExpressions.jl/blob/5cf61b47feae317e80a3703b288c84b8e1f4fd7e/src/ParametricExpression.jl#L71-L82

struct ParametricExpression{
    T,
    N<:ParametricNode{T},
    D<:NamedTuple{(:operators, :variable_names, :parameters, :parameter_names)},
} <: AbstractExpression{T,N}
    tree::N
    metadata::Metadata{D}

    function ParametricExpression(tree::ParametricNode, metadata::Metadata)
        return new{eltype(tree),typeof(tree),typeof(_data(metadata))}(tree, metadata)
    end
end

However, this is only stored in the options as a constructor (just the wrapper type); the actual object being evaluated should have a fully-resolved concrete type.
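
A quick check (hypothetical REPL session) confirms that what the options store is the unparameterized UnionAll rather than a concrete type:

julia> using SymbolicRegression

julia> options = Options(expression_type=ParametricExpression);

julia> options.expression_type
ParametricExpression

julia> isconcretetype(options.expression_type)
false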

@wsmoses
Member

wsmoses commented Dec 1, 2024

hm, rephrasing: what is the 46th non-ghost type (i.e., a type which has non-zero-size storage)?

And yeah, what are the relevant subtypes of the fully specialized type above?

@MilesCranmer
Contributor Author

They should be something like:

ParametricExpression{
    Float64,
    ParametricNode{Float64},
    @NamedTuple{
        operators::DynamicExpressions.OperatorEnumModule.OperatorEnum{
            Tuple{typeof(+),typeof(*),typeof(-),typeof(/)},Tuple{}
        },
        variable_names::Vector{String},
        parameters::Matrix{Float64},
        parameter_names::Vector{String},
    }
}

from

julia> using SymbolicRegression

julia> options = Options(expression_type=ParametricExpression, binary_operators=[+, *, -, /]);

julia> x1 = ParametricNode{Float64}(; feature=1)
x1

julia> ex = ParametricExpression(x1; parameters=ones(Float64, 1, 1), parameter_names=["p1"], variable_names=["x1"], operators=options.operators)
x1

julia> typeof(ex)
ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(-), typeof(/)}, Tuple{}}, variable_names::Vector{String}, parameters::Matrix{Float64}, parameter_names::Vector{String}}}

@MilesCranmer
Contributor Author

hm, rephrasing: what is the 46th non-ghost type (i.e., a type which has non-zero-size storage)?

Is there a way to filter non-ghost types?

@wsmoses
Member

wsmoses commented Dec 1, 2024

okay, so this is basically the code we run, which is itself erroring during this:

using Enzyme, SymbolicRegression, DynamicExpressions, ADTypes

offset=UInt32[0x0000002d, 0x00000000]

typ = Options{
    SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}},
    ParametricNode,
    ParametricExpression,
    @NamedTuple{max_parameters::Int64},
    MutationWeights,
    false, false,
    nothing,
    ADTypes.AutoEnzyme{Nothing, Nothing},
    5
}

ltyp = typ
for ind in offset
    if !Base.isconcretetype(typ)
        throw(AssertionError("Illegal absint ltyp=$ltyp, typ=$typ, offset=$offset, ind=$ind"))
    end
    cnt = 0
    for i = 1:fieldcount(typ)
        styp = Enzyme.Compiler.typed_fieldtype(typ, i)
        if Enzyme.Compiler.isghostty(styp)
            continue
        end
        if cnt == ind
            typ = styp
            break
        end
        cnt += 1
    end
    @show cnt, typ
end

slightly modified from https://github.com/EnzymeAD/Enzyme.jl/blob/79678f7a93fd1a65ffc633ed132baf8d33a1b4f8/src/absint.jl#L639C1-L656C16

@wsmoses
Member

wsmoses commented Dec 1, 2024

oh bleh I see, unions mess up the indexing count:

struct Pair{A,B}
	a::A
	b::B
end

Pair{Vector{Float32}, Union{Float32, Nothing}}

convert(LLVMType, Pair{Vector{Float32}, Union{Float32, Nothing}})
# { {} addrspace(10)*, i32, i8 }
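
For reference, the extra third slot is the isbits-Union layout: the Union{Float32, Nothing} field is stored inline as an i32 payload plus a hidden i8 selector byte, which Base reflection also shows (hypothetical REPL session; Base.isbitsunion is an internal helper):

julia> Base.isbitsunion(Union{Float32, Nothing})
true

julia> fieldcount(Pair{Vector{Float32}, Union{Float32, Nothing}})
2

So two Julia fields lower to three LLVM struct elements, and any index that counts LLVM elements by Julia field position drifts past a Union field.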

@MilesCranmer
Contributor Author

Might it be the optimizer_algorithm::Optim.AbstractOptimizer field? I specialized that, and now instead of the assertion error I get an Abort trap: 6
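
(The kind of specialization meant here is roughly the following pattern, as a hypothetical sketch rather than the actual SymbolicRegression change: move the abstractly-typed field into a type parameter so the struct layout is fully concrete.)

using Optim

struct LooseOptions
    optimizer_algorithm::Optim.AbstractOptimizer  # abstract field: layout not concrete
end

struct TightOptions{A<:Optim.AbstractOptimizer}
    optimizer_algorithm::A  # concrete once instantiated
end

TightOptions(Optim.BFGS())  # field type now fully known to the compiler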

@wsmoses
Member

wsmoses commented Dec 1, 2024

does this fix it for you: #2155

@wsmoses
Member

wsmoses commented Dec 1, 2024

okay with that branch, the code at the top works for me

julia> model = SRRegressor(;
           niterations=100,
           binary_operators=[+, *, /, -],
           unary_operators=[cos, exp],
           populations=30,
           expression_type=ParametricExpression,
           expression_options=(; max_parameters=2),
           autodiff_backend=:Enzyme,
       );

julia> mach = machine(model, X, y)
untrained Machine; caches model-specific representations of data
  model: SRRegressor(defaults = nothing, …)
  args: 
    1:	Source @543 ⏎ ScientificTypesBase.Table{Union{AbstractVector{ScientificTypesBase.Continuous}, AbstractVector{ScientificTypesBase.Count}}}
    2:	Source @128 ⏎ AbstractVector{ScientificTypesBase.Continuous}


julia> fit!(mach)
[ Info: Training machine(SRRegressor(defaults = nothing, …), …).
┌ Warning: You are using multithreading mode, but only one thread is available. Try starting julia with `--threads=auto`.
└ @ SymbolicRegression ~/.julia/packages/SymbolicRegression/44X04/src/Configure.jl:59
[ Info: Started!
Evolving for 100 iterations... 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| Time: 0:01:40
[ Info: Final population:
─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
Complexity  Loss       Score      Equation
1           3.326e+00  3.604e+01  y = p1
4           2.081e+00  1.564e-01  y = cos(x1) * p2
5           1.132e+00  6.083e-01  y = p2 + (x1 * x1)
7           1.105e+00  1.214e-02  y = p2 + ((0.19837 + x1) * x1)
8           6.571e-01  5.201e-01  y = p2 + ((x1 * x1) + cos(x2))
9           2.134e-01  1.125e+00  y = ((x2 * p1) + (x1 * x1)) + p2
11          2.067e-01  1.580e-02  y = (x2 * p1) + (p2 + (1.0569 * (x1 * x1)))
13          5.673e-02  6.466e-01  y = (cos(p2 + x2) + (p2 - (cos(x1) / 0.56366))) * 1.875
14          5.094e-02  1.076e-01  y = 1.8621 * (p2 + (cos(x2 + p2) - (cos(x1) / cos(p1))))
15          1.378e-02  1.308e+00  y = 2.0677 * ((p2 + cos(p2 + x2)) - (cos(p1 * x1) / 0.57136))
17          1.256e-03  1.197e+00  y = (((p2 + cos(p2 + x2)) - (cos(x1 * 0.37438) / 0.13134)) - -5.881) * 1.9956
19          4.650e-04  4.970e-01  y = (((p2 - (cos(x1 * 0.37002) / 0.12908)) + cos((p1 + p2) + x2)) - -5.94) * 2.0078
21          2.128e-04  3.908e-01  y = 2.0057 * (((cos(x2 + p2) + 5.0129) - p1) + (((0.37194 - cos(-0.30475 * x1)) / 0.089255) + p2))
23          1.113e-04  3.243e-01  y = (((((cos(x2 + p2) + 0.041882) + p2) - p1) + ((0.47219 - cos(x1 * -0.26001)) / 0.065669)) - -5.9609) * 2.0039
27          1.101e-04  2.530e-03  y = ((1.8074 * ((((0.042485 - p1) + (p2 + cos(x2 + p2))) + ((0.48085 - cos(-0.25793 * x1)) / 0.064551)) - -5.9615)) / 0.90307) - -0.0072206
28          7.346e-05  4.050e-01  y = (((cos(x2 + p2) + ((0.21833 - cos(-0.3405 * x1)) / 0.11298)) + (((0.13962 - p1) - -4.7783) + p2)) * 1.9901) / cos(x1 * -0.11985)
30          7.313e-05  2.270e-03  y = (1.99 * (((((0.19832 - cos(x1 * -0.3484)) / 0.11832) + cos(x2 + p2)) + (((0.13655 - p1) - -4.7752) + p2)) / cos(x1 * -0.12135))) + p1
─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
[ Info: Results saved to:
  - outputs/20241201_173537_kDoCr1/hall_of_fame.csv
trained Machine; caches model-specific representations of data
  model: SRRegressor(defaults = nothing, …)
  args: 
    1:	Source @543 ⏎ ScientificTypesBase.Table{Union{AbstractVector{ScientificTypesBase.Continuous}, AbstractVector{ScientificTypesBase.Count}}}
    2:	Source @128 ⏎ AbstractVector{ScientificTypesBase.Continuous}

@MilesCranmer
Contributor Author

Nice!!! Thanks.

(I still seem to get segfaults after running for a bit; is that something about the caching interacting with the multithreading?)

[61256] signal 11 (2): Segmentation fault: 11
in expression starting at REPL[10]:1
gc_mark_obj8 at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:0 [inlined]
gc_mark_outrefs at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:2888 [inlined]
gc_mark_and_steal at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:2993
gc_mark_loop_parallel at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:3141
jl_parallel_gc_threadfun at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/scheduler.c:151
_pthread_start at /usr/lib/system/libsystem_pthread.dylib (unknown line)
Allocations: 785098264 (Pool: 782822967; Big: 2275297); GC: 866   

@MilesCranmer
Contributor Author

Also are you running with NonGenABI or with it off?

@wsmoses
Member

wsmoses commented Dec 1, 2024

I'm just doing the code I pasted directly without changes.

And I'm not seeing a segfault, but there's a known segfault in Julia itself which we found before (the fix apparently wasn't backported to 1.10 yet). It's possible it's that, but if not, we should try to fix that one too.

See JuliaLang/julia#55306 / JuliaLang/julia#56653 for the current Julia segfault.

No idea of course if that's what you're hitting

@MilesCranmer
Contributor Author

MilesCranmer commented Dec 1, 2024

Thanks. I'm on 1.11 at the moment (and macOS). I'll check if using 1.10 fixes anything.

With the NonGenABI it also seems to work, which is great (let me know if I should close this?).

I get a similar segfault. It happens after 10 iterations have passed, meaning we are successfully doing a lot of Enzyme-based optimizations, which is good! So maybe it's a bug in the Julia GC?

Evolving for 100 iterations...  10%|█████████████████████▍                                                                                                                                                                                            |  ETA: 0:01:07
[61781] signal 11 (2): Segmentation fault: 11
in expression starting at /Users/mcranmer/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/examples/parameterized_function.jl:102
gc_mark_obj8 at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:0 [inlined]
gc_mark_outrefs at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:2888 [inlined]                                         
gc_mark_and_steal at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:2993                                                 
gc_mark_loop_parallel at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:3133 [inlined]                                   
gc_mark_loop at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:3152                                                      
_jl_gc_collect at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:3538                                                    
ijl_gc_collect at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:3899                                                    
maybe_collect at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:922 [inlined]                                            
jl_gc_pool_alloc_inner at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:1325 [inlined]                                  
ijl_gc_pool_alloc_instrumented at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:1383                                    
_eval_tree_array at /Users/mcranmer/.julia/packages/DynamicExpressions/LMkFg/src/Evaluate.jl:187
#eval_tree_array#2 at /Users/mcranmer/.julia/packages/DynamicExpressions/LMkFg/src/Evaluate.jl:156
eval_tree_array at /Users/mcranmer/.julia/packages/DynamicExpressions/LMkFg/src/Evaluate.jl:131 [inlined]
#eval_tree_array#17 at /Users/mcranmer/.julia/packages/DynamicExpressions/LMkFg/src/ParametricExpression.jl:353
eval_tree_array at /Users/mcranmer/.julia/packages/DynamicExpressions/LMkFg/src/ParametricExpression.jl:335 [inlined]
eval_tree_dispatch at /Users/mcranmer/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/ParametricExpression.jl:84 [inlined]
_eval_loss at /Users/mcranmer/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/LossFunctions.jl:87
#eval_loss#3 at /Users/mcranmer/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/LossFunctions.jl:146
eval_loss at /Users/mcranmer/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/LossFunctions.jl:138 [inlined]
evaluator at /Users/mcranmer/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/ext/SymbolicRegressionEnzymeExt.jl:27 [inlined]
evaluator at /Users/mcranmer/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/ext/SymbolicRegressionEnzymeExt.jl:0 [inlined]
diffejulia_evaluator_33238_inner_42wrap at /Users/mcranmer/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/ext/SymbolicRegressionEnzymeExt.jl:0
macro expansion at /Users/mcranmer/.julia/packages/Enzyme/yOwHI/src/compiler.jl:5190 [inlined]
enzyme_call at /Users/mcranmer/.julia/packages/Enzyme/yOwHI/src/compiler.jl:4736 [inlined]
CombinedAdjointThunk at /Users/mcranmer/.julia/packages/Enzyme/yOwHI/src/compiler.jl:4608 [inlined]
autodiff at /Users/mcranmer/.julia/packages/Enzyme/yOwHI/src/Enzyme.jl:503 [inlined]
autodiff at /Users/mcranmer/.julia/packages/Enzyme/yOwHI/src/Enzyme.jl:544 [inlined]
autodiff at /Users/mcranmer/.julia/packages/Enzyme/yOwHI/src/Enzyme.jl:516 [inlined]
#1 at /Users/mcranmer/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/ext/SymbolicRegressionEnzymeExt.jl:42
unknown function (ip: 0x36b8142ff)
jl_apply at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/./julia.h:2157 [inlined]
start_task at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/task.c:1202
Allocations: 364229539 (Pool: 363378515; Big: 851024); GC: 419

@wsmoses
Member

wsmoses commented Dec 1, 2024

hm, yeah, I ran to completion on 1.10 on macOS [latest Enzyme commit].

I know the fix was also backported to 1.11 (JuliaLang/julia#55344), so that means it's likely something separate.

Can you open a different issue for the segfault with as simplified an MWE as possible? (Having for loops and/or manual GC.gc calls is fine, but ideally it's some code with a single autodiff call [perhaps in a loop] with as simple an inner function as possible.)

@MilesCranmer
Contributor Author

Confirming I can run it on 1.10 too. It's just 1.11 that hits the segfault. So I guess it's a Julia bug?

Given how random the bug is, the interaction with multithreading, and my limited knowledge of Julia's GC process, I think it will be pretty hard for me to make a clean MWE. I could try to make an rr trace instead, if that's useful? (Assuming I can repro on Linux.)

@wsmoses
Member

wsmoses commented Dec 1, 2024

Yeah, totally understood. Though of course making it as simple as possible would be helpful for trying to fix it.

@MilesCranmer
Contributor Author

x-posted to JuliaLang/julia#56735

@MilesCranmer
Contributor Author

MilesCranmer commented Dec 2, 2024

Crazy thing: when I run with --threads=auto --gcthreads=1, the segfault goes away. The segfault only happens when the GC is parallel; the code itself can be parallel though.
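
(So as a workaround until that's fixed, pinning the GC to a single thread avoids the crash, e.g. assuming the example script from this thread:)

julia --threads=auto --gcthreads=1 examples/parameterized_function.jl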
