Trying ArrayFire #213
It seems to me that GPU convolution and recurrence are kernel-specific. ArrayFire.jl has its own convolution function. Other basic functions can get their gradients through AutoGrad.
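For reference, a minimal sketch of what "gradients through AutoGrad" means in practice; this is standard AutoGrad.jl usage with a toy function, not code from Knet:

```julia
using AutoGrad

# grad(f) returns a function computing the gradient of f
# with respect to its first argument.
f(x) = sum(x .* x)
df = grad(f)
df(ones(3))   # => [2.0, 2.0, 2.0]
```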
Will your support for CPU convolution and RNNs apply to any array type, like CuArray or AFArray?
Yes, all functions that call cuDNN or hand-written kernels in libknet8 are currently KnetArray-specific. There is no fundamental reason for this: they should work with any raw CUDA pointer plus dimension information, so we could write methods to support other array types. The CPU code will not carry over, because it uses a for loop that visits elements one at a time, which would be inefficient on GPU arrays. I am waiting for Julia 0.7 to come out, when CUDAnative, GPUArrays, CLArrays, and CuArrays will be supported out of the box (i.e. without requiring a Julia recompilation), before seriously looking into supporting multiple array types. However, if you want to try it out, I can help you write conv4/pool methods for other array types.
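As a rough illustration of the idea, a correctness-first fallback could route AFArray inputs through KnetArray until pointer-level methods exist. This helper is hypothetical (not part of Knet or ArrayFire.jl); `conv4`, `KnetArray`, and the `Array`/`AFArray` conversions are real:

```julia
using Knet, ArrayFire

# Hypothetical fallback: copy into a KnetArray, run Knet's
# cudnn-backed conv4, and copy back.  Correct but slow: it
# round-trips through host memory on every call.  A real method
# would pass the raw device pointer and dims to cudnn directly.
function conv4_af(w::AFArray, x::AFArray; o...)
    ky = conv4(KnetArray(Array(w)), KnetArray(Array(x)); o...)
    AFArray(Array(ky))
end
```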
Thanks a lot for your help. I am trying ArrayFire, so it would be great if you could help with conv4/pool methods for ArrayFire. I'd like to try it for the RNN too, but if you don't have time, I may be able to follow your changes in conv4/pool and apply them to rnn.jl.
I did some tests following test/karray.jl. AFArray subtypes AbstractArray, but it seems some array operations are not supported yet: JuliaGPU/ArrayFire.jl#188
May I recommend duplicating conv.jl (or any other unimplemented op) and replacing the KnetArray type with AFArray. The @cuda calls should work if pointer(::AFArray) is defined correctly.
I replaced all KnetArray with AFArray and added the following to the top of src/conv.jl, then rebuilt Knet. The build went OK:

```julia
using ArrayFire
# https://github.com/JuliaComputing/ArrayFire.jl/issues/189
# get_device_ptr, device_array, lock_device_ptr, unlock_device_ptr,
pointer{T}(a::AFArray{T}) = convert(Ptr{T}, get_device_ptr(a))
pointer{T}(a::AFArray{T},i) = convert(Ptr{T}, get_device_ptr(a) + (i-1)*sizeof(T))
Base.similar(x::AFArray, s::Tuple) = AFArray(similar(Array(x), s))
AFArray{T}(len::Integer) where T = AFArray(Array{T}(len))
...
```

When I ran this example:

```julia
using Knet, ArrayFire
atype = Array{Float32}
gtype = AFArray
#= gtype = KnetArray =#
w = xavier(5,5,1,20) |> atype |> gtype
x = rand(28,28,1,128) |> atype |> gtype
conv4(w, x; padding=0)
```

I got the error.
If I add this line, the build segfaults at `@primitive conv4(w,x; o...),dy conv4w(w,x,dy;o...) conv4x(w,x,dy;o...)`.
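For context, that `@primitive` line is AutoGrad's macro for registering hand-written gradients for a function. A minimal standalone example of the same macro on a toy function (not Knet's code):

```julia
using AutoGrad

# Register square's gradient by hand: given upstream dy and
# output y, the gradient of y = x.^2 w.r.t. x is 2x .* dy.
square(x) = x .* x
@primitive square(x),dy,y  (2 .* x .* dy)

df = grad(x -> sum(square(x)))
df(ones(3))   # => [2.0, 2.0, 2.0]
```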
I am trying the examples in Knet's examples directory with ArrayFire.jl. So far I have housing.jl and mnist.jl working. For lenet.jl, I got the error below. Is convolution tied only to KnetArray?
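One quick way to check whether conv4 is tied to KnetArray is to list its methods in the REPL; this is standard Julia introspection, nothing Knet-specific:

```julia
using Knet
# Lists every signature conv4 accepts; if none mentions AFArray
# (stock Knet of this era dispatches on Array and KnetArray, as the
# discussion above suggests), an AFArray call has no matching method.
methods(conv4)
```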
Below is the simplified version of lenet.jl I used: