Implement cuDNN activation functions. #33
Conversation
- Add wrapper functions for `cudnnActivationForward` and `cudnnActivationBackward` (see the sketch after this list).
- Move `relu` to be backend-defined.
- Implement `relu` and `relu_grad` using cuDNN, add tests.
- Minor: change out-of-bounds indexing via `Dimensions.apply` to clearly throw `IndexOutOfBoundsException`.

TODO:

- Support `relu` for non-4D tensors. This will likely require shape padding with "1" dimensions to work with the cuDNN API.
- Implement other activation functions.
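For context, here is a minimal sketch of the kind of C code such a wrapper generates for a 4D float tensor. The function name, buffer setup, and the `CUDNN_CALL` error-checking macro are assumptions for illustration, not the PR's exact output:

```c
#include <stdio.h>
#include <stdlib.h>
#include <cudnn.h>

#define CUDNN_CALL(f) do {                                   \
  cudnnStatus_t stat = (f);                                  \
  if (stat != CUDNN_STATUS_SUCCESS) {                        \
    fprintf(stderr, "cuDNN error %d\n", (int)stat); exit(1); \
  }                                                          \
} while (0)

/* Apply relu to an n*c*h*w float tensor already resident on the GPU.
   cudnnHandle, x, and y are assumed to be created elsewhere. */
void relu_forward(cudnnHandle_t cudnnHandle, const float *x, float *y,
                  int n, int c, int h, int w) {
  cudnnTensorDescriptor_t x_desc;
  CUDNN_CALL(cudnnCreateTensorDescriptor(&x_desc));
  /* cuDNN activations want 4D shapes; a non-4D tensor would be
     padded with "1" dimensions before reaching this call. */
  CUDNN_CALL(cudnnSetTensor4dDescriptor(
      x_desc, CUDNN_TENSOR_NCHW, CUDNN_DATA_FLOAT, n, c, h, w));

  cudnnActivationDescriptor_t act_desc;
  CUDNN_CALL(cudnnCreateActivationDescriptor(&act_desc));
  CUDNN_CALL(cudnnSetActivationDescriptor(
      act_desc, CUDNN_ACTIVATION_RELU, CUDNN_PROPAGATE_NAN, 0.0));

  float one = 1.0f, zero = 0.0f;
  /* y = 1 * relu(x) + 0 * y */
  CUDNN_CALL(cudnnActivationForward(
      cudnnHandle, act_desc,
      &one, x_desc, x,
      &zero, x_desc, y));

  CUDNN_CALL(cudnnDestroyActivationDescriptor(act_desc));
  CUDN N_CALL(cudnnDestroyTensorDescriptor(x_desc));
}
```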
```scala
override def tanh(x: Tensor) = x.map(s => Math.tanh(s).toFloat)
```
It'd be good to decouple these backend functions from `Tensor` methods (`map` and `add_oneMinusSquare_mult`).
Yes, I agree. We are sorting things out as we encounter them.
I accidentally copied tanh's gradient.
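(For reference: tanh's gradient is `grad * (1 - tanh(x)^2)`, which is presumably what `add_oneMinusSquare_mult` computes, whereas relu's gradient simply zeroes the upstream gradient where the input is negative.)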
"CUDNN_CALL(cudnnActivationBackward(\n" + | ||
" cudnnHandle, act_desc,\n" + | ||
" ", one, ", x_desc, ", res.x.data, ", x_desc, ", res.d.data, ", x_desc, ", input.x.data, ",\n", | ||
" ", zero, ", x_desc, ", inputGrad.data, "));\n" + |
As suggested by @feiwang3311: rather than using `beta = 0` and `input.d += inputGrad`, use `beta = 1`. Same elsewhere (for conv2d, etc.). I'll take this on.
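For readers unfamiliar with cuDNN's blending parameters: `cudnnActivationBackward` computes `dx = alpha * grad + beta * dx`, so passing `beta = 1` with the existing gradient buffer as `dx` accumulates in place. A sketch of the two variants, reusing the hypothetical `CUDNN_CALL` macro from above and assuming the descriptors and device buffers are already set up:

```c
float one = 1.0f, zero = 0.0f;

/* Current variant: write the gradient to a temporary, then add it
   into input.d in a separate step (inputGrad = 1 * dRelu + 0 * inputGrad). */
CUDNN_CALL(cudnnActivationBackward(
    cudnnHandle, act_desc,
    &one,  x_desc, res_x,    /* y:  forward output    */
           x_desc, res_d,    /* dy: upstream gradient */
           x_desc, input_x,  /* x:  forward input     */
    &zero, x_desc, inputGrad));
/* ... followed by input.d += inputGrad elsewhere. */

/* Suggested variant: accumulate directly into the existing gradient
   buffer (input_d = 1 * dRelu + 1 * input_d), skipping the temporary
   and the extra elementwise add. */
CUDNN_CALL(cudnnActivationBackward(
    cudnnHandle, act_desc,
    &one, x_desc, res_x,
          x_desc, res_d,
          x_desc, input_x,
    &one, x_desc, input_d));
```

cuDNN's other backward routines (for convolutions, etc.) use `alpha`/`beta` blending the same way, which is why the same change applies to conv2d.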