Julia version: 1.7.3
Flux version: 0.13.3
CUDA version: 11.6
Minimal code:
using Flux, CUDA
m = Dense(10, 5, softmax) |> gpu
m(cu(rand(10)))
Error:
ERROR: GPU broadcast resulted in non-concrete element type Union{}.
This probably means that the function you are broadcasting contains an error or type instability.
Stacktrace:
[1] error(s::String)
@ Base .\error.jl:33
[2] copy
@ C:\Users\Ben\.julia\packages\GPUArrays\Zecv7\src\host\broadcast.jl:44 [inlined]
[3] materialize
@ .\broadcast.jl:860 [inlined]
[4] (::Dense{typeof(softmax), CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}, CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}})(x::CuArray{Float32, 1, CUDA.Mem.DeviceBuffer})
@ Flux C:\Users\Ben\.julia\packages\Flux\js6mP\src\layers\basic.jl:159
[5] top-level scope
@ c:\Users\Ben\Documents\Dev\CY\CalabiYau.jl\src\test2.jl:4
The error also occurs if softmax is replaced with logsoftmax, but the code runs fine with any other activation function. The minimal example is simple enough that I assumed I must be missing something obvious, but after many hours of searching I haven't found an answer, so I'm posting it here.
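A possible explanation (my assumption, not confirmed in this report): `Dense` applies its activation elementwise via broadcasting, i.e. `σ.(W*x .+ b)`, whereas `softmax` is a vector-to-vector function that normalizes over a dimension, so broadcasting it over scalars is ill-defined and the GPU broadcast fails to infer an element type. If that is the cause, applying `softmax` as a separate layer after `Dense` should work:

```julia
using Flux, CUDA

# Sketch of a workaround, assuming the failure comes from Dense
# broadcasting softmax elementwise: apply softmax to the whole
# output vector as its own layer instead of as an activation.
m = Chain(Dense(10, 5), softmax) |> gpu
y = m(cu(rand(Float32, 10)))  # 5-element CuArray; entries should sum to ~1
```

This keeps the normalization over the full output vector, which is what `softmax` expects, rather than invoking it once per scalar.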