This issue keeps track of which Flux layers in the model reference have LRP implementations. A usage sketch follows the list below.

Basic layers

Dense
flatten

Convolution

Conv
DepthwiseConv
ConvTranspose
CrossCor

Pooling layers

AdaptiveMaxPool
MaxPool
GlobalMaxPool
AdaptiveMeanPool
MeanPool
GlobalMeanPool

General purpose

Maxout
SkipConnection
Chain (Support nested Flux Chains in LRP #119)
Parallel (Add LRP support for Parallel layer #10)
Bilinear
Diagonal
Embedding

Normalisation & regularisation

normalise
BatchNorm (Enable BatchNorm layers in LRP #129)
dropout
Dropout
AlphaDropout
LayerNorm
InstanceNorm
GroupNorm

Upsampling layers

Upsample
PixelShuffle

Recurrent layers

RNN
LSTM
GRU
Recur
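As referenced above, here is a minimal sketch of what an LRP analysis over a Flux `Chain` might look like. The `LRP` analyzer and `analyze` call are assumed to follow the ExplainableAI.jl API, and the model architecture and input shapes are illustrative only, not part of this issue.

```julia
# Minimal sketch, assuming the ExplainableAI.jl `LRP`/`analyze` API;
# the model and shapes below are illustrative only.
using Flux
using ExplainableAI

model = Chain(
    Conv((3, 3), 3 => 8, relu),   # Conv layer (listed above)
    MaxPool((2, 2)),              # pooling layer (listed above)
    Flux.flatten,                 # flatten before the dense head
    Dense(15 * 15 * 8, 10),       # Dense layer (listed above)
)

input = rand(Float32, 32, 32, 3, 1)   # WHCN image batch of size 1
analyzer = LRP(model)                 # wrap the model in an LRP analyzer
expl = analyze(input, analyzer)       # compute relevance attributions
```

Whether such an analyzer can handle a given model depends on every layer in the chain having an LRP rule, which is what the checklist above tracks.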
Closed by #157.