
Faster ReLU support using NNPACK #55

Closed
americast wants to merge 2 commits

Conversation

@americast commented Jul 8, 2018

Facilitates faster ReLU using NNPACK. This requires libnnpack.so, NNPACK's shared object file, which currently has to be placed manually in /src; work is in progress to handle this automatically via BinaryBuilder.

Currently, two tests fail when run locally:

```
return type ForwardDiff.Dual{Void,Float32,1} does not match inferred return type Union{Array{Float32,0}, ForwardDiff.Dual{Void,Float32,1}}
```

at https://github.com/FluxML/NNlib.jl/blob/julia-0.6/test/activation.jl#L30 (which has been removed in the latest master) and

```
return type Float32 does not match inferred return type Union{Array{Float32,0}, Float32}
```

at https://github.com/FluxML/NNlib.jl/blob/julia-0.6/test/activation.jl#L7.

@americast (Author)

Put up ReLU along with Softmax in #56.

@americast americast closed this Jul 11, 2018
ToucheSir pushed a commit that referenced this pull request Feb 13, 2023
* move ctc loss from Flux

* fixup

* trivial

* rm cpu