
Does Flux support N-dimensional convolutions, where N>3? #451

Open
bhvieira opened this issue Oct 22, 2018 · 20 comments

@bhvieira
Contributor

While this works:

Conv((1,1,1), 1=>1)(rand(1,1,1,1,1)) #3D convolution

This throws an error:

Conv((1,1,1,1), 1=>1)(rand(1,1,1,1,1,1)) #4D convolution
#ERROR: MethodError: no method matching conv!(::Array{Float64,6}, ::Array{Float64,6}, ::Array{Float64,6}; pad=(0, 0, 0, 0), stride=(1, 1, 1, 1), dilation=(1, 1, 1, 1))

I'm running Julia 0.7 and:

[587475ba] Flux v0.6.7+ #master (https://github.com/FluxML/Flux.jl.git)
[872c559c] NNlib v0.4.2+ #master (https://github.com/FluxML/NNlib.jl.git)
@MikeInnes
Member

Not right now; each of the 1-, 2- and 3-D convolutions just calls out to a specific kernel. But I'd happily take an implementation of an N-D convolution in NNlib, even if it's not heavily optimised.
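
For anyone who picks this up, here is a minimal sketch of what an unoptimised N-D direct convolution could look like on the CPU. The function name, the (spatial..., channels, batch) data layout, the (spatial..., Cin, Cout) weight layout, and the lack of padding/stride/dilation support are all assumptions for illustration only; this is not NNlib's actual API.

# Naive N-D "convolution" (really cross-correlation, as in most DL frameworks).
# Assumed layouts: x is (spatial..., Cin, batch), w is (spatial..., Cin, Cout).
# No padding, stride, or dilation; purely illustrative and unoptimised.
function naive_convnd(x::AbstractArray{T}, w::AbstractArray{T}) where {T}
    nspatial = ndims(x) - 2
    cin, cout = size(w, nspatial + 1), size(w, nspatial + 2)
    @assert size(x, nspatial + 1) == cin
    outsize = ntuple(i -> size(x, i) - size(w, i) + 1, nspatial)
    y = zeros(T, outsize..., cout, size(x, ndims(x)))
    for n in axes(x, ndims(x)), co in 1:cout, I in CartesianIndices(outsize)
        acc = zero(T)
        for ci in 1:cin, K in CartesianIndices(ntuple(i -> size(w, i), nspatial))
            acc += x[CartesianIndex(Tuple(I) .+ Tuple(K) .- 1), ci, n] * w[K, ci, co]
        end
        y[I, co, n] = acc
    end
    return y
end

size(naive_convnd(rand(8, 8, 8, 8, 1, 1), rand(3, 3, 3, 3, 1, 1)))  # (6, 6, 6, 6, 1, 1)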

@bhvieira
Contributor Author

@MikeInnes Right, thanks for the answer; I might look into it. I have 3D time series (fMRI volumes), and I did not want to go overboard with a 3D convolutional RNN, so I thought a simple 4D CNN (with kernel size 1 in the time dimension) could be a better idea. Do you want to keep the issue open as a feature request, or should I close it?
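
As an aside, a kernel of size 1 along time amounts to sharing a single 3D convolution across all time steps, so the same effect can be obtained by slicing. A minimal sketch of that workaround, assuming an (x, y, z, t, channels, batch) layout; conv_over_time is an illustrative helper, not Flux API:

using Flux

# A 4D conv with kernel size 1 in time == the same 3D conv applied to each
# time slice. Assumed layout: (x, y, z, t, channels, batch).
conv3d = Conv((3, 3, 3), 1 => 8, relu)

function conv_over_time(c, v)
    out = map(axes(v, 4)) do t
        y = c(v[:, :, :, t, :, :])   # (x', y', z', cout, batch)
        reshape(y, size(y, 1), size(y, 2), size(y, 3), 1, size(y, 4), size(y, 5))
    end
    return cat(out...; dims = 4)     # (x', y', z', t, cout, batch)
end

size(conv_over_time(conv3d, rand(Float32, 16, 16, 16, 10, 1, 2)))  # (14, 14, 14, 10, 8, 2)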

@MikeInnes
Member

Yeah, we may as well keep this open, thanks.

@shreyas-kowshik
Contributor

I would like to work on this issue. Can someone please guide me as to what needs to be done?

@MikeInnes
Member

Might be best to speak to @avik-pal. On the GPU we're limited by what NVIDIA provides in CUDNN, but it should be straightforward to write an N-dimensional CPU kernel.

@avik-pal
Member

avik-pal commented Feb 7, 2019

Unfortunately, CUDNN only provides 2D and 3D convolutions.

@shreyas-kowshik
Contributor

@avik-pal So should I go ahead with the CPU part? And if yes, can you please guide me as to where I should start looking in the code?

@avik-pal
Member

avik-pal commented Feb 7, 2019

@shreyas-kowshik You will first have to add it to NNlib. Here are the 2D and 3D implementations.

@datnamer

datnamer commented Feb 7, 2019

Is it not possible to write an N-D conv on the GPU?

@avik-pal
Member

avik-pal commented Feb 7, 2019

@datnamer It is definitely possible. We need to write the CUDA kernel with CUDAnative and get it integrated with the Flux API. However, making the kernel efficient is not as straightforward as optimizing the CPU kernel.
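
For context, here is a toy sketch of the CUDAnative kernel-launch workflow as it looked at the time (CUDAnative and CuArrays have since been folded into CUDA.jl). This is a simple SAXPY kernel, not a convolution kernel; an N-D conv kernel would follow the same pattern, with each thread computing one output element.

using CUDAnative, CuArrays

# Device function: each thread updates one element of y.
function saxpy_kernel!(y, a, x)
    i = (blockIdx().x - 1) * blockDim().x + threadIdx().x
    if i <= length(y)
        @inbounds y[i] += a * x[i]
    end
    return nothing
end

x = CuArrays.rand(Float32, 1024)
y = CuArrays.zeros(Float32, 1024)
@cuda threads=256 blocks=cld(length(y), 256) saxpy_kernel!(y, 2f0, x)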

@datnamer

datnamer commented Feb 7, 2019

Understood

@shreyas-kowshik
Contributor

I was going through some code in NNlib.jl when I came across this:
FluxML/NNlib.jl#31

The PR effectively adds N-dimensional convolutions to NNlib.jl.

@cossio
Contributor

cossio commented Aug 1, 2019

Should this be closed since FluxML/NNlib.jl#94 has been merged?

@datnamer

datnamer commented Aug 1, 2019

Does that implementation work on GPUs?

@bhvieira
Contributor Author

bhvieira commented Aug 2, 2019

@cossio It doesn't appear to work, though. What should the syntax be to get (4+)-D convolutions working?

@darsnack
Member

I think this should no longer be an issue.

@AriMKatz

Even on CUDA?

@darsnack
Member

Not sure. Will test tomorrow morning (or someone else can race me).

@CarloLucibello
Member

On CPU, we support only up to 3D convolutions:
https://github.com/FluxML/NNlib.jl/blob/master/src/conv.jl
I suspect it is the same for CUDNN, but I didn't check.

@darsnack
Member

Okay, then we should definitely keep this open.
