
[WIP] mode parameter for Convolution/cross-convolution #71

Merged · 5 commits · Oct 8, 2018

Conversation

@ayush1999 (Contributor) commented Oct 4, 2018

@MikeInnes Would expressing the mode argument in each function call work, or do I need to add another crosscor function?

@ayush1999 (Contributor Author)

Also, there was some discussion in FluxML/Flux.jl#308 about the way to express these changes in Flux. Could you give me a final decision on it so that I can go ahead and add them to Flux?

@MikeInnes (Member)

We can pass mode around in the implementation, but let's not expose it in the conv or conv! interface. Instead let's add crosscor and crosscor!; these should both be trivial definitions that forward to conv(!) with the mode changed.

src/conv.jl (Outdated)

```julia
function crossconv!(y::AbstractArray{T,3}, x::AbstractArray{T,3}, w::AbstractArray{T,3};
                    pad = 0, stride = 1, dilation = 1) where T
  # Reshape the 3-D arrays to 4-D and forward to the 4-D method.
  args = map(x -> reshape(x, size(x,1), 1, size(x,2), size(x,3)), (y, x, w))
  crossconv!(args..., pad = (pad..., 0), stride = (stride..., 1), dilation = (dilation..., 1))
end
```
Member:

That's crosscor. It should also be a thin wrapper that doesn't need as much duplication as you've added here; if you just call conv! here with the right mode, for example, you won't need specific 2D and 3D wrappers.
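The "thin wrapper" idea above can be sketched with a toy 1-D direct convolution. The names `conv1d`/`crosscor1d` are illustrative stand-ins, not NNlib's actual API: cross-correlation is convolution with an unflipped kernel, so the wrapper only needs to reverse the kernel and forward, with no per-dimensionality duplication.

```julia
# Illustrative sketch, not NNlib's real implementation: a direct 1-D
# convolution producing a valid-padding (no-padding) output.
function conv1d(x::AbstractVector, w::AbstractVector)
    n, k = length(x), length(w)
    y = zeros(promote_type(eltype(x), eltype(w)), n - k + 1)
    for i in eachindex(y), j in 1:k
        y[i] += x[i + j - 1] * w[k - j + 1]  # kernel flipped: true convolution
    end
    return y
end

# The thin wrapper: cross-correlation is convolution with the kernel
# reversed, so one forwarding definition covers it.
crosscor1d(x, w) = conv1d(x, reverse(w))
```

For example, `conv1d([1, 2, 3, 4], [1, 0])` gives `[2, 3, 4]`, while `crosscor1d([1, 2, 3, 4], [1, 0])` gives `[1, 2, 3]`.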

Contributor Author:

But conv! doesn't take a mode parameter in its arguments at the moment (and you also advised me not to expose mode in conv!).

Member:

I think it would be ok to have it in conv! (perhaps as flipkernel) but just not document it. That makes it a bit easier to support crosscor! without a lot of duplication. You can also then avoid the gradient wrappers, since we can just support flipkernel in Flux.
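The flipkernel proposal might look like the following sketch, again a hypothetical 1-D stand-in (`conv1d!`/`crosscor1d!` are illustrative names, not NNlib's real signatures): the undocumented keyword toggles kernel flipping, and crosscor! becomes a one-line forwarder.

```julia
# Hedged sketch with hypothetical names: an in-place 1-D convolution whose
# undocumented flipkernel keyword switches between convolution (flipped
# kernel, the default) and cross-correlation (unflipped kernel).
function conv1d!(y::AbstractVector, x::AbstractVector, w::AbstractVector;
                 flipkernel::Bool = false)
    k = length(w)
    fill!(y, zero(eltype(y)))
    for i in eachindex(y), j in 1:k
        wj = flipkernel ? w[j] : w[k - j + 1]
        y[i] += x[i + j - 1] * wj
    end
    return y
end

# crosscor! is then a trivial forwarder -- no dimension-specific
# duplicates or separate gradient wrappers needed.
crosscor1d!(y, x, w) = conv1d!(y, x, w; flipkernel = true)
```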

@ayush1999 (Contributor Author)

@MikeInnes I've removed all gradient wrappers and added the flipkernel argument to conv!.

@MikeInnes (Member)

Ok perfect. Thanks a lot!

@MikeInnes MikeInnes merged commit 519b5c2 into FluxML:master Oct 8, 2018
@jekbradbury (Contributor) commented Oct 8, 2018

It still uses “crossconv”, which should instead be “crosscor”

@MikeInnes (Member)

Yeah that's fixed in 5727643

3 participants