VGG example fails on GPU #42

Closed
tbenst opened this issue Apr 11, 2019 · 4 comments
tbenst commented Apr 11, 2019

The code runs successfully on the CPU, but as soon as CuArrays is involved I hit this error. The same problem occurs if I try x = rand(Float32, 224, 224, 3, 1); vgg(x);

julia> using CuArrays, Metalhead, Flux
julia> using Metalhead: classify
julia> vgg = VGG19() |> gpu
VGG19()
julia> img = load("/home/tyler/Downloads/elephant.jpg");
julia> classify(vgg,img)
ERROR: conversion to pointer not defined for CuArray{Float32,4}
Stacktrace:
 [1] error(::String) at ./error.jl:33
 [2] unsafe_convert(::Type{Ptr{Float32}}, ::CuArray{Float32,4}) at ./pointer.jl:67
 [3] pointer(::CuArray{Float32,4}) at ./abstractarray.jl:880
 [4] #conv2d!#39(::Float32, ::Function, ::Array{Float32,4}, ::Array{Float32,4}, ::CuArray{Float32,4}, ::NNlib.ConvDims{(224, 224),(3, 3),3,(1, 1),(1, 1, 1, 1),(1, 1),true}) at /home/tyler/.julia/packages/NNlib/UpABH/src/impl/conv.jl:350
 [5] (::getfield(NNlib, Symbol("#kw##conv2d!")))(::NamedTuple{(:alpha,),Tuple{Float32}}, ::typeof(NNlib.conv2d!), ::Array{Float32,4}, ::Array{Float32,4}, ::CuArray{Float32,4}, ::NNlib.ConvDims{(224, 224),(3, 3),3,(1, 1),(1, 1, 1, 1),(1, 1),true}) at ./none:0
 [6] #conv2d!#40(::Tuple{Int64,Int64}, ::Tuple{Int64,Int64}, ::Tuple{Int64,Int64}, ::Int64, ::Float32, ::Function, ::Array{Float32,4}, ::Array{Float32,4}, ::CuArray{Float32,4}) at /home/tyler/.julia/packages/NNlib/UpABH/src/impl/conv.jl:373
 [7] #conv2d! at ./none:0 [inlined]
 [8] #conv!#68 at /home/tyler/.julia/packages/NNlib/UpABH/src/conv.jl:118 [inlined]
 [9] #conv! at ./none:0 [inlined]
 [10] #conv#54(::Nothing, ::Tuple{Int64,Int64}, ::Tuple{Int64,Int64}, ::Tuple{Int64,Int64}, ::Function, ::Array{Float32,4}, ::CuArray{Float32,4}) at /home/tyler/.julia/packages/NNlib/UpABH/src/conv.jl:62
 [11] #conv at ./none:0 [inlined]
 [12] (::Conv{2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}})(::Array{Float32,4}) at /home/tyler/.julia/packages/Flux/lz7S9/src/layers/conv.jl:53
 [13] Conv at /home/tyler/.julia/packages/Flux/lz7S9/src/layers/conv.jl:63 [inlined]
 [14] applychain at /home/tyler/.julia/packages/Flux/lz7S9/src/layers/basic.jl:31 [inlined]
 [15] (::Chain{Tuple{Conv{2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},getfield(Metalhead, Symbol("##42#48")),Conv{2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},getfield(Metalhead, Symbol("##43#49")),Conv{2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},getfield(Metalhead, Symbol("##44#50")),Conv{2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},getfield(Metalhead, Symbol("##45#51")),Conv{2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},getfield(Metalhead, Symbol("##46#52")),getfield(Metalhead, Symbol("##47#53")),Dense{typeof(relu),LinearAlgebra.Adjoint{Float32,CuArray{Float32,2}},CuArray{Float32,1}},Dropout{Float32},Dense{typeof(relu),LinearAlgebra.Adjoint{Float32,CuArray{Float32,2}},CuArray{Float32,1}},Dropout{Float32},Dense{typeof(identity),LinearAlgebra.Adjoint{Float32,CuArray{Float32,2}},CuArray{Float32,1}},typeof(softmax)}})(::Array{Float32,4}) at /home/tyler/.julia/packages/Flux/lz7S9/src/layers/basic.jl:33
 [16] (::VGG19)(::Array{Float32,4}) at /home/tyler/.julia/packages/Metalhead/fYeSU/src/vgg19.jl:46
 [17] forward(::VGG19, ::Array{ColorTypes.RGB{FixedPointNumbers.Normed{UInt8,8}},2}) at /home/tyler/.julia/packages/Metalhead/fYeSU/src/utils.jl:70
 [18] classify(::VGG19, ::Array{ColorTypes.RGB{FixedPointNumbers.Normed{UInt8,8}},2}) at /home/tyler/.julia/packages/Metalhead/fYeSU/src/utils.jl:98
 [19] top-level scope at none:0

Edit: here are my manifest & project files: https://gist.github.com/tbenst/f370a5baae6e120c5869f591d5006794 for Julia 1.1
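Note that the trace shows ::Array{Float32,4} data being passed alongside ::CuArray{Float32,4} weights, which is what happens when only the model, and not the input, is on the GPU (in my rand test above, x was still a CPU array). Presumably something like this is needed, moving both to the device (untested sketch, assuming Flux's usual gpu helper):

```julia
using CuArrays, Flux, Metalhead

vgg = VGG19() |> gpu                      # moves the weights to the GPU
x = rand(Float32, 224, 224, 3, 1) |> gpu  # the input must live on the GPU too
vgg(x)                                    # conv should now get CuArrays on both sides
```

Though even then, classify does its image preprocessing internally, so it may still hand the model a CPU array.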

tbenst commented Apr 11, 2019

Hmm, this seems to be a different issue: FluxML/Flux.jl#287 (comment)

tbenst closed this as completed Apr 11, 2019
tbenst commented May 13, 2019

So I added CUDNN, but I'm still getting a similar error:

julia> CuArrays.CUDNN.version()
v"7.4.2"
julia> classify(vgg,img)
ERROR: conversion to pointer not defined for CuArray{Float32,5}
Stacktrace:
 [1] error(::String) at ./error.jl:33
 [2] unsafe_convert(::Type{Ptr{Float32}}, ::CuArray{Float32,5}) at ./pointer.jl:67
 [3] pointer(::CuArray{Float32,5}) at ./abstractarray.jl:880
 [4] macro expansion at /home/tyler/.julia/packages/NNlib/mxWRT/src/impl/conv_im2col.jl:55 [inlined]
 [5] macro expansion at ./gcutils.jl:87 [inlined]
 [6] macro expansion at /home/tyler/.julia/packages/NNlib/mxWRT/src/impl/conv_im2col.jl:53 [inlined]
 [7] #conv_im2col!#231(::Array{Float32,2}, ::Float32, ::Float32, ::Function, ::Array{Float32,5}, ::Array{Float32,5}, ::CuArray{Float32,5}, ::DenseConvDims{3,(3, 3, 1),3,64,(1, 1, 1),(1, 1, 1, 1, 0, 0),(1, 1, 1),false}) at /home/tyler/.julia/packages/TimerOutputs/7zSea/src/TimerOutput.jl:190
 [8] conv_im2col! at /home/tyler/.julia/packages/TimerOutputs/7zSea/src/TimerOutput.jl:198 [inlined]
 [9] macro expansion at /home/tyler/.julia/packages/NNlib/mxWRT/src/conv.jl:51 [inlined]
 [10] #conv!#37 at /home/tyler/.julia/packages/TimerOutputs/7zSea/src/TimerOutput.jl:190 [inlined]
 [11] conv!(::Array{Float32,5}, ::Array{Float32,5}, ::CuArray{Float32,5}, ::DenseConvDims{3,(3, 3, 1),3,64,(1, 1, 1),(1, 1, 1, 1, 0, 0),(1, 1, 1),false}) at /home/tyler/.julia/packages/TimerOutputs/7zSea/src/TimerOutput.jl:198
 [12] #conv!#56(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::Array{Float32,4}, ::Array{Float32,4}, ::CuArray{Float32,4}, ::DenseConvDims{2,(3, 3),3,64,(1, 1),(1, 1, 1, 1),(1, 1),false}) at /home/tyler/.julia/packages/NNlib/mxWRT/src/conv.jl:68
 [13] conv!(::Array{Float32,4}, ::Array{Float32,4}, ::CuArray{Float32,4}, ::DenseConvDims{2,(3, 3),3,64,(1, 1),(1, 1, 1, 1),(1, 1),false}) at /home/tyler/.julia/packages/NNlib/mxWRT/src/conv.jl:68
 [14] macro expansion at /home/tyler/.julia/packages/NNlib/mxWRT/src/conv.jl:114 [inlined]
 [15] #conv#97(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::Array{Float32,4}, ::CuArray{Float32,4}, ::DenseConvDims{2,(3, 3),3,64,(1, 1),(1, 1, 1, 1),(1, 1),false}) at /home/tyler/.julia/packages/TimerOutputs/7zSea/src/TimerOutput.jl:190
 [16] conv(::Array{Float32,4}, ::CuArray{Float32,4}, ::DenseConvDims{2,(3, 3),3,64,(1, 1),(1, 1, 1, 1),(1, 1),false}) at /home/tyler/.julia/packages/TimerOutputs/7zSea/src/TimerOutput.jl:198
 [17] (::Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}})(::Array{Float32,4}) at /home/tyler/.julia/packages/Flux/qXNjB/src/layers/conv.jl:55
 [18] applychain(::Tuple{Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},getfield(Metalhead, Symbol("##42#48")),Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},getfield(Metalhead, Symbol("##43#49")),Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},getfield(Metalhead, Symbol("##44#50")),Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},getfield(Metalhead, Symbol("##45#51")),Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},getfield(Metalhead, Symbol("##46#52")),getfield(Metalhead, Symbol("##47#53")),Dense{typeof(relu),LinearAlgebra.Adjoint{Float32,CuArray{Float32,2}},CuArray{Float32,1}},Dropout{Float32},Dense{typeof(relu),LinearAlgebra.Adjoint{Float32,CuArray{Float32,2}},CuArray{Float32,1}},Dropout{Float32},Dense{typeof(identity),LinearAlgebra.Adjoint{Float32,CuArray{Float32,2}},CuArray{Float32,1}},typeof(softmax)}, ::Array{Float32,4}) at /home/tyler/.julia/packages/Flux/qXNjB/src/layers/basic.jl:31
 [19] (::Chain{Tuple{Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},getfield(Metalhead, Symbol("##42#48")),Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},getfield(Metalhead, Symbol("##43#49")),Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},getfield(Metalhead, Symbol("##44#50")),Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},getfield(Metalhead, Symbol("##45#51")),Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},Conv{2,2,typeof(relu),CuArray{Float32,4},CuArray{Float32,1}},getfield(Metalhead, Symbol("##46#52")),getfield(Metalhead, Symbol("##47#53")),Dense{typeof(relu),LinearAlgebra.Adjoint{Float32,CuArray{Float32,2}},CuArray{Float32,1}},Dropout{Float32},Dense{typeof(relu),LinearAlgebra.Adjoint{Float32,CuArray{Float32,2}},CuArray{Float32,1}},Dropout{Float32},Dense{typeof(identity),LinearAlgebra.Adjoint{Float32,CuArray{Float32,2}},CuArray{Float32,1}},typeof(softmax)}})(::Array{Float32,4}) at /home/tyler/.julia/packages/Flux/qXNjB/src/layers/basic.jl:33
 [20] (::VGG19)(::Array{Float32,4}) at /home/tyler/.julia/packages/Metalhead/fYeSU/src/vgg19.jl:46
 [21] forward(::VGG19, ::Array{ColorTypes.RGB{FixedPointNumbers.Normed{UInt8,8}},2}) at /home/tyler/.julia/packages/Metalhead/fYeSU/src/utils.jl:70
 [22] classify(::VGG19, ::Array{ColorTypes.RGB{FixedPointNumbers.Normed{UInt8,8}},2}) at /home/tyler/.julia/packages/Metalhead/fYeSU/src/utils.jl:98
 [23] top-level scope at none:0
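Again the trace mixes ::Array{Float32,5} buffers with ::CuArray{Float32,5} weights, so NNlib's CPU im2col path is being hit; Metalhead's preprocessing in utils.jl appears to produce a CPU array. A quick sanity check I can run on the device placement (sketch, assuming Flux.params works here):

```julia
using Flux, CuArrays

# Are all the model's parameters actually CuArrays?
all(p -> p isa CuArray, Flux.params(vgg))  # expect true after |> gpu
```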

tbenst reopened this May 13, 2019
tbenst commented May 13, 2019

If helpful, here are the Flux test results:

[ Info: Testing Layers
0.044721193126701045
[ Info: Running Gradient Checks
[ Info: Testing GPU Support
[ Info: Testing Flux/CUDNN
batch_size = 1: Error During Test at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:7
  Got exception outside of a @test
  CUDNNError(code 3, CUDNN_STATUS_BAD_PARAM)
  Stacktrace:
   [1] macro expansion at /home/tyler/.julia/packages/CuArrays/PwSdF/src/dnn/error.jl:19 [inlined]
   [2] cudnnRNNBackwardData(::Flux.CUDA.RNNDesc{Float32}, ::Int64, ::Array{CuArrays.CUDNN.TensorDesc,1}, ::CuArray{Float32,1}, ::Array{CuArrays.CUDNN.TensorDesc,1}, ::CuArray{Float32,1}, ::CuArrays.CUDNN.TensorDesc, ::CuArray{Float32,1}, ::Ptr{Nothing}, ::CUDAdrv.CuPtr{Nothing}, ::CuArrays.CUDNN.FilterDesc, ::CuArray{Float32,1}, ::CuArrays.CUDNN.TensorDesc, ::CuArray{Float32,1}, ::Ptr{Nothing}, ::CUDAdrv.CuPtr{Nothing}, ::Array{CuArrays.CUDNN.TensorDesc,1}, ::CuArray{Float32,1}, ::CuArrays.CUDNN.TensorDesc, ::CuArray{Float32,1}, ::Ptr{Nothing}, ::CUDAdrv.CuPtr{Nothing}, ::CuArray{UInt8,1}, ::CuArray{UInt8,1}) at /home/tyler/.julia/packages/Flux/qXNjB/src/cuda/curnn.jl:170
   [3] backwardData(::Flux.CUDA.RNNDesc{Float32}, ::CuArray{Float32,1}, ::CuArray{Float32,1}, ::CuArray{Float32,1}, ::Nothing, ::CuArray{Float32,1}, ::Nothing, ::CuArray{UInt8,1}) at /home/tyler/.julia/packages/Flux/qXNjB/src/cuda/curnn.jl:187
   [4] backwardData(::Flux.CUDA.RNNDesc{Float32}, ::CuArray{Float32,1}, ::CuArray{Float32,1}, ::CuArray{Float32,1}, ::CuArray{Float32,1}, ::CuArray{UInt8,1}) at /home/tyler/.julia/packages/Flux/qXNjB/src/cuda/curnn.jl:195
   [5] (::getfield(Flux.CUDA, Symbol("##8#9")){Flux.GRUCell{TrackedArray{…,CuArray{Float32,2}},TrackedArray{…,CuArray{Float32,1}}},TrackedArray{…,CuArray{Float32,1}},TrackedArray{…,CuArray{Float32,1}},CuArray{UInt8,1},Tuple{CuArray{Float32,1},CuArray{Float32,1}}})(::Tuple{CuArray{Float32,1},CuArray{Float32,1}}) at /home/tyler/.julia/packages/Flux/qXNjB/src/cuda/curnn.jl:306
   [6] back_(::Tracker.Call{getfield(Flux.CUDA, Symbol("##8#9")){Flux.GRUCell{TrackedArray{…,CuArray{Float32,2}},TrackedArray{…,CuArray{Float32,1}}},TrackedArray{…,CuArray{Float32,1}},TrackedArray{…,CuArray{Float32,1}},CuArray{UInt8,1},Tuple{CuArray{Float32,1},CuArray{Float32,1}}},Tuple{Tracker.Tracked{CuArray{Float32,1}},Tracker.Tracked{CuArray{Float32,1}},Tracker.Tracked{CuArray{Float32,2}},Tracker.Tracked{CuArray{Float32,2}},Tracker.Tracked{CuArray{Float32,1}}}}, ::Tuple{CuArray{Float32,1},CuArray{Float32,1}}, ::Bool) at /home/tyler/.julia/packages/Tracker/AIiYy/src/back.jl:35
   [7] back(::Tracker.Tracked{Tuple{CuArray{Float32,1},CuArray{Float32,1}}}, ::Tuple{CuArray{Float32,1},Int64}, ::Bool) at /home/tyler/.julia/packages/Tracker/AIiYy/src/back.jl:58
   [8] (::getfield(Tracker, Symbol("##13#14")){Bool})(::Tracker.Tracked{Tuple{CuArray{Float32,1},CuArray{Float32,1}}}, ::Tuple{CuArray{Float32,1},Int64}) at /home/tyler/.julia/packages/Tracker/AIiYy/src/back.jl:38
   [9] foreach(::Function, ::Tuple{Tracker.Tracked{Tuple{CuArray{Float32,1},CuArray{Float32,1}}},Nothing}, ::Tuple{Tuple{CuArray{Float32,1},Int64},Nothing}) at ./abstractarray.jl:1867
   [10] back_(::Tracker.Call{getfield(Tracker, Symbol("##361#363")){Tracker.TrackedTuple{Tuple{CuArray{Float32,1},CuArray{Float32,1}}},Int64},Tuple{Tracker.Tracked{Tuple{CuArray{Float32,1},CuArray{Float32,1}}},Nothing}}, ::CuArray{Float32,1}, ::Bool) at /home/tyler/.julia/packages/Tracker/AIiYy/src/back.jl:38
   [11] back(::Tracker.Tracked{CuArray{Float32,1}}, ::CuArray{Float32,1}, ::Bool) at /home/tyler/.julia/packages/Tracker/AIiYy/src/back.jl:58
   [12] back!(::TrackedArray{…,CuArray{Float32,1}}, ::CuArray{Float32,1}) at /home/tyler/.julia/packages/Tracker/AIiYy/src/back.jl:77
   [13] top-level scope at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:23
   [14] top-level scope at /build/source/usr/share/julia/stdlib/v1.1/Test/src/Test.jl:1156
   [15] top-level scope at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:7
   [16] top-level scope at /build/source/usr/share/julia/stdlib/v1.1/Test/src/Test.jl:1156
   [17] top-level scope at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:4
   [18] top-level scope at /build/source/usr/share/julia/stdlib/v1.1/Test/src/Test.jl:1083
   [19] top-level scope at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:4
   [20] include at ./boot.jl:326 [inlined]
   [21] include_relative(::Module, ::String) at ./loading.jl:1038
   [22] include(::Module, ::String) at ./sysimg.jl:29
   [23] include(::String) at ./client.jl:403
   [24] top-level scope at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/cuda.jl:45
   [25] include at ./boot.jl:326 [inlined]
   [26] include_relative(::Module, ::String) at ./loading.jl:1038
   [27] include(::Module, ::String) at ./sysimg.jl:29
   [28] include(::String) at ./client.jl:403
   [29] top-level scope at /home/tyler/.julia/packages/Flux/qXNjB/test/runtests.jl:30
   [30] top-level scope at /build/source/usr/share/julia/stdlib/v1.1/Test/src/Test.jl:1083
   [31] top-level scope at /home/tyler/.julia/packages/Flux/qXNjB/test/runtests.jl:11
   [32] include at ./boot.jl:326 [inlined]
   [33] include_relative(::Module, ::String) at ./loading.jl:1038
   [34] include(::Module, ::String) at ./sysimg.jl:29
   [35] include(::String) at ./client.jl:403
   [36] top-level scope at none:0
   [37] eval(::Module, ::Any) at ./boot.jl:328
   [38] exec_options(::Base.JLOptions) at ./client.jl:243
   [39] _start() at ./client.jl:436
batch_size = 5: Test Failed at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:26
  Expression: ((rnn.cell).Wi).grad ≈ collect(((curnn.cell).Wi).grad)
   Evaluated: Float32[0.0264221 0.0311623 … 0.0401529 0.0481648; -0.00580488 -0.00684245 … -0.000633113 -0.00512767; … ; -0.488995 -0.397307 … -0.543058 -0.593239; -1.59764 -1.72794 … -1.24722 -2.0625] ≈ Float32[0.0163969 0.0200817 … 0.0357488 0.036319; -0.00383122 -0.00466102 … 0.000233916 -0.00279561; … ; -0.655613 -0.581466 … -0.616253 -0.790114; -1.32217 -1.42347 … -1.1262 -1.737]
Stacktrace:
 [1] top-level scope at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:26
 [2] top-level scope at /build/source/usr/share/julia/stdlib/v1.1/Test/src/Test.jl:1156
 [3] top-level scope at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:7
 [4] top-level scope at /build/source/usr/share/julia/stdlib/v1.1/Test/src/Test.jl:1156
 [5] top-level scope at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:4
 [6] top-level scope at /build/source/usr/share/julia/stdlib/v1.1/Test/src/Test.jl:1083
 [7] top-level scope at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:4
batch_size = 5: Test Failed at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:27
  Expression: ((rnn.cell).Wh).grad ≈ collect(((curnn.cell).Wh).grad)
   Evaluated: Float32[-0.0155654 0.00326925 … -0.0217709 -0.00794538; 0.00331486 -0.000961465 … 0.00208863 -0.000682507; … ; 0.0600955 -0.0379895 … 0.0877803 0.056216; 0.13519 -0.049922 … 0.1269 0.0464798] ≈ Float32[-0.0102084 0.00063507 … -0.0177858 -0.00776742; 0.00226024 -0.000442876 … 0.00130408 -0.000717541; … ; 0.0800566 -0.047805 … 0.10263 0.0568791; 0.120198 -0.04255 … 0.115747 0.0459818]
Stacktrace:
 [1] top-level scope at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:27
 [2] top-level scope at /build/source/usr/share/julia/stdlib/v1.1/Test/src/Test.jl:1156
 [3] top-level scope at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:7
 [4] top-level scope at /build/source/usr/share/julia/stdlib/v1.1/Test/src/Test.jl:1156
 [5] top-level scope at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:4
 [6] top-level scope at /build/source/usr/share/julia/stdlib/v1.1/Test/src/Test.jl:1083
 [7] top-level scope at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:4
batch_size = 5: Test Failed at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:28
  Expression: ((rnn.cell).b).grad ≈ collect(((curnn.cell).b).grad)
   Evaluated: Float32[0.0431294, -0.00524064, 0.00328069, 0.0383973, -0.0277038, -0.190451, 0.378964, 0.113537, -1.10465, -0.109885, -0.380812, -0.806893, -0.591657, -0.840546, -2.16732] ≈ Float32[0.0309576, -0.0028444, 0.00784941, 0.0508012, -0.0251556, -0.116478, 0.196908, 0.156512, -1.30342, -0.109, -0.0338723, -0.537076, -0.840815, -1.04284, -1.83286]
Stacktrace:
 [1] top-level scope at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:28
 [2] top-level scope at /build/source/usr/share/julia/stdlib/v1.1/Test/src/Test.jl:1156
 [3] top-level scope at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:7
 [4] top-level scope at /build/source/usr/share/julia/stdlib/v1.1/Test/src/Test.jl:1156
 [5] top-level scope at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:4
 [6] top-level scope at /build/source/usr/share/julia/stdlib/v1.1/Test/src/Test.jl:1083
 [7] top-level scope at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:4
batch_size = 5: Test Failed at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:29
  Expression: ((rnn.cell).h).grad ≈ collect(((curnn.cell).h).grad)
   Evaluated: Float32[-0.0128217, -0.706767, -0.358611, -1.41596, -0.547122] ≈ Float32[0.0667142, -0.127551, -0.454483, -1.79769, -0.414044]
Stacktrace:
 [1] top-level scope at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:29
 [2] top-level scope at /build/source/usr/share/julia/stdlib/v1.1/Test/src/Test.jl:1156
 [3] top-level scope at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:7
 [4] top-level scope at /build/source/usr/share/julia/stdlib/v1.1/Test/src/Test.jl:1156
 [5] top-level scope at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:4
 [6] top-level scope at /build/source/usr/share/julia/stdlib/v1.1/Test/src/Test.jl:1083
 [7] top-level scope at /home/tyler/.julia/packages/Flux/qXNjB/test/cuda/curnn.jl:4
Test Summary:        | Pass  Fail  Error  Total
Flux                 |  254     4      1    259
  Throttle           |   11                  11
  Jacobian           |    1                   1
  Initialization     |   12                  12
  Params             |    2                   2
  Basic Stacking     |    1                   1
  Precision          |    6                   6
  Stacking           |    3                   3
  onecold            |    4                   4
  Optimise           |   11                  11
  Optimiser          |    3                   3
  Training Loop      |    2                   2
  basic              |   25                  25
  Dropout            |    8                   8
  BatchNorm          |   14                  14
  InstanceNorm       |   16                  16
  GroupNorm          |   16                  16
  losses             |   30                  30
  Pooling            |    2                   2
  CNN                |    1                   1
  asymmetric padding |    7                   7
  Depthwise Conv     |    4                   4
  ConvTranspose      |    1                   1
  Tracker            |    4                   4
  CuArrays           |    8                   8
  CUDNN BatchNorm    |   10                  10
  RNN                |   40     4      1     45
    R = Flux.RNN     |   16                  16
    R = Flux.GRU     |    6     4      1     11
      batch_size = 1 |    2            1      3
      batch_size = 5 |    4     4             8
    R = Flux.LSTM    |   18                  18
ERROR: LoadError: Some tests did not pass: 254 passed, 4 failed, 1 errored, 0 broken.
in expression starting at /home/tyler/.julia/packages/Flux/qXNjB/test/runtests.jl:9
ERROR: Package Flux errored during testing

The 4 failures are already documented: FluxML/Flux.jl#267.

And the Metalhead tests:

pkg> test Metalhead
   Testing Metalhead
    Status `/run/user/1000/tmpHQ4clH/Manifest.toml`
  [621f4979] AbstractFFTs v0.4.1
  [1520ce14] AbstractTrees v0.2.1
  [79e6a3ab] Adapt v0.4.2
  [13072b0f] AxisAlgorithms v0.3.0
  [39de3d68] AxisArrays v0.3.0
  [fbb218c0] BSON v0.2.3
  [9e28174c] BinDeps v0.8.10
  [b99e7846] BinaryProvider v0.5.4
  [00ebfdb7] CSTParser v0.5.2
  [aafaddc9] CatIndices v0.2.0
  [944b1d66] CodecZlib v0.5.2
  [3da002f7] ColorTypes v0.7.5
  [c3611d14] ColorVectorSpace v0.6.2
  [5ae59095] Colors v0.9.5
  [bbf7d656] CommonSubexpressions v0.2.0
  [34da2185] Compat v2.1.0
  [ed09eef8] ComputationalResources v0.3.0
  [8f4d0f93] Conda v1.2.0
  [150eb455] CoordinateTransformations v0.5.0
  [a8cc5b0e] Crayons v4.0.0
  [dc8bdbbb] CustomUnitRanges v0.2.0
  [864edb3b] DataStructures v0.15.0
  [163ba53b] DiffResults v0.0.4
  [b552c78f] DiffRules v0.0.10
  [b4f34e82] Distances v0.8.0
  [4f61f5a4] FFTViews v0.2.0
  [7a1cc6ca] FFTW v0.2.4
  [5789e2e9] FileIO v1.0.6
  [53c48c17] FixedPointNumbers v0.5.3
  [587475ba] Flux v0.8.3
  [f6369f11] ForwardDiff v0.10.3
  [a2bd30eb] Graphics v0.4.0
  [bbac6d45] IdentityRanges v0.3.0
  [2803e5a7] ImageAxes v0.6.0
  [a09fc81d] ImageCore v0.7.4
  [51556ac3] ImageDistances v0.1.1
  [6a3955dd] ImageFiltering v0.5.4
  [bc367c6b] ImageMetadata v0.6.1
  [787d08f9] ImageMorphology v0.1.1
  [4e3cecfd] ImageShow v0.2.0
  [02fcd773] ImageTransformations v0.8.0
  [916415d5] Images v0.17.3
  [9b13fd28] IndirectArrays v0.5.0
  [a98d9a8b] Interpolations v0.12.0
  [8197267c] IntervalSets v0.3.1
  [c8e1da08] IterTools v1.1.1
  [682c06a0] JSON v0.20.0
  [e5e0dc1b] Juno v0.7.0
  [1914dd2f] MacroTools v0.5.0
  [dbb5928d] MappedArrays v0.2.1
  [e89f7d12] Media v0.5.0
  [dbeba491] Metalhead v0.3.0
  [e1d29d7a] Missings v0.4.1
  [872c559c] NNlib v0.6.0
  [77ba4419] NaNMath v0.3.2
  [6fe1bfb0] OffsetArrays v0.11.0
  [bac558e1] OrderedCollections v1.1.0
  [5432bcbf] PaddedViews v0.4.2
  [92933f4c] ProgressMeter v0.9.0
  [b3c3ace0] RangeArrays v0.3.1
  [c84ed2f1] Ratios v0.3.1
  [189a3867] Reexport v0.2.0
  [ae029012] Requires v0.5.2
  [6038ab10] Rotations v0.11.1
  [699a6c99] SimpleTraits v0.8.0
  [a2af1166] SortingAlgorithms v0.3.1
  [276daf66] SpecialFunctions v0.7.2
  [90137ffa] StaticArrays v0.10.3
  [2913bbd2] StatsBase v0.30.0
  [06e1c1a7] TiledIteration v0.2.3
  [a759f4b9] TimerOutputs v0.5.0
  [0796e94c] Tokenize v0.5.3
  [9f7883ad] Tracker v0.2.1
  [3bb67fe8] TranscodingStreams v0.9.4
  [30578b45] URIParser v0.4.0
  [81def892] VersionParsing v1.1.3
  [efce3f68] WoodburyMatrices v0.4.1
  [a5390f91] ZipFile v0.8.1
  [2a0f44e3] Base64  [`@stdlib/Base64`]
  [ade2ca70] Dates  [`@stdlib/Dates`]
  [8bb1440f] DelimitedFiles  [`@stdlib/DelimitedFiles`]
  [8ba89e20] Distributed  [`@stdlib/Distributed`]
  [b77e0a4c] InteractiveUtils  [`@stdlib/InteractiveUtils`]
  [76f85450] LibGit2  [`@stdlib/LibGit2`]
  [8f399da3] Libdl  [`@stdlib/Libdl`]
  [37e2e46d] LinearAlgebra  [`@stdlib/LinearAlgebra`]
  [56ddb016] Logging  [`@stdlib/Logging`]
  [d6f4376e] Markdown  [`@stdlib/Markdown`]
  [a63ad114] Mmap  [`@stdlib/Mmap`]
  [44cfe95a] Pkg  [`@stdlib/Pkg`]
  [de0858da] Printf  [`@stdlib/Printf`]
  [9abbd945] Profile  [`@stdlib/Profile`]
  [3fa0cd96] REPL  [`@stdlib/REPL`]
  [9a3f8284] Random  [`@stdlib/Random`]
  [ea8e919c] SHA  [`@stdlib/SHA`]
  [9e88b42a] Serialization  [`@stdlib/Serialization`]
  [1a1011a3] SharedArrays  [`@stdlib/SharedArrays`]
  [6462fe0b] Sockets  [`@stdlib/Sockets`]
  [2f01184e] SparseArrays  [`@stdlib/SparseArrays`]
  [10745b16] Statistics  [`@stdlib/Statistics`]
  [8dfed614] Test  [`@stdlib/Test`]
  [cf7118a7] UUIDs  [`@stdlib/UUIDs`]
  [4ec0a83e] Unicode  [`@stdlib/Unicode`]
ERROR: LoadError: MethodError: no method matching maxpool(::Array{Float32,4}, ::Tuple{Int64,Int64})
Closest candidates are:
  maxpool(::AbstractArray{xT,N}, ::NNlib.PoolDims; kwargs...) where {xT, N} at /home/tyler/.julia/packages/TimerOutputs/7zSea/src/TimerOutput.jl:198
Stacktrace:
 [1] (::getfield(Metalhead, Symbol("##42#48")))(::Array{Float32,4}) at /home/tyler/.julia/packages/Metalhead/fYeSU/src/vgg19.jl:6
 [2] applychain(::Tuple{getfield(Metalhead, Symbol("##42#48")),Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},getfield(Metalhead, Symbol("##43#49")),Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},getfield(Metalhead, Symbol("##44#50")),Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},getfield(Metalhead, Symbol("##45#51")),Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},getfield(Metalhead, Symbol("##46#52")),getfield(Metalhead, Symbol("##47#53")),Flux.Dense{typeof(NNlib.relu),LinearAlgebra.Adjoint{Float32,Array{Float32,2}},Array{Float32,1}},Flux.Dropout{Float32},Flux.Dense{typeof(NNlib.relu),LinearAlgebra.Adjoint{Float32,Array{Float32,2}},Array{Float32,1}},Flux.Dropout{Float32},Flux.Dense{typeof(identity),LinearAlgebra.Adjoint{Float32,Array{Float32,2}},Array{Float32,1}},typeof(NNlib.softmax)}, ::Array{Float32,4}) at /home/tyler/.julia/packages/Flux/qXNjB/src/layers/basic.jl:31 (repeats 3 times)
 [3] (::Flux.Chain{Tuple{Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},getfield(Metalhead, Symbol("##42#48")),Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},getfield(Metalhead, Symbol("##43#49")),Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},getfield(Metalhead, Symbol("##44#50")),Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},getfield(Metalhead, Symbol("##45#51")),Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},Flux.Conv{2,2,typeof(NNlib.relu),Array{Float32,4},Array{Float32,1}},getfield(Metalhead, Symbol("##46#52")),getfield(Metalhead, Symbol("##47#53")),Flux.Dense{typeof(NNlib.relu),LinearAlgebra.Adjoint{Float32,Array{Float32,2}},Array{Float32,1}},Flux.Dropout{Float32},Flux.Dense{typeof(NNlib.relu),LinearAlgebra.Adjoint{Float32,Array{Float32,2}},Array{Float32,1}},Flux.Dropout{Float32},Flux.Dense{typeof(identity),LinearAlgebra.Adjoint{Float32,Array{Float32,2}},Array{Float32,1}},typeof(NNlib.softmax)}})(::Array{Float32,4}) at /home/tyler/.julia/packages/Flux/qXNjB/src/layers/basic.jl:33
 [4] (::VGG19)(::Array{Float32,4}) at /home/tyler/.julia/packages/Metalhead/fYeSU/src/vgg19.jl:46
 [5] top-level scope at none:0
 [6] include at ./boot.jl:326 [inlined]
 [7] include_relative(::Module, ::String) at ./loading.jl:1038
 [8] include(::Module, ::String) at ./sysimg.jl:29
 [9] include(::String) at ./client.jl:403
 [10] top-level scope at none:0
in expression starting at /home/tyler/.julia/packages/Metalhead/fYeSU/test/runtests.jl:7
ERROR: Package Metalhead errored during testing

darsnack commented Aug 5, 2021

The underlying Flux issue has been resolved, so I'm closing this.

darsnack closed this as completed Aug 5, 2021