Releases · LuxDL/Lux.jl
v0.4.19
Lux v0.4.19
Diff since v0.4.18
Closed issues:
Named Layers for Container Types (#79) (see the sketch below)
CUDNNError during backpropagation in simple CNN (#141)
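A minimal sketch of what #79 is about, assuming the keyword-argument form of Chain for naming sub-layers; the layer names (encoder, head) and the data shapes are illustrative only.

```julia
using Lux, Random

rng = Random.default_rng()

# Container layers built from keyword arguments (assumed per #79) expose the
# given names as keys of the parameter/state NamedTuples.
model = Chain(; encoder=Dense(2, 8, relu), head=Dense(8, 1))

ps, st = Lux.setup(rng, model)
x = randn(rng, Float32, 2, 4)

y, _ = model(x, ps, st)
ps.encoder.weight   # parameters addressed by the chosen name instead of layer_1, layer_2, ...
```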
Merged pull requests:
v0.4.17
Lux v0.4.17
Diff since v0.4.16
Closed issues:
Inconsistent description of PairwiseFusion (#130)
No method matching with argument IRTools.Inner.Undefined in gradient computation. (#134)
Merged pull requests:
LSTM docs: don't go over first element in sequence twice (#132) (@visr) (see the sketch below)
fix PairwiseFusion docs (#133) (@MilkshakeForReal)
Generic recurrent cells (#136) (@jumerckx)
relu tests with finite diff is too unreliable (#137) (@avik-pal)
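The docs fix in #132 concerns iterating an LSTMCell over a sequence: the first element initialises the carry and must not be fed to the cell again inside the loop. A minimal sketch, with assumed input/hidden sizes and sequence layout:

```julia
using Lux, Random

# Run an LSTMCell over a sequence: seq[1] initialises the carry, the loop then
# starts at seq[2], so the first element is processed exactly once (cf. #132).
function run_sequence(cell, seq, ps, st)
    (y, carry), st = cell(seq[1], ps, st)
    for x in seq[2:end]
        (y, carry), st = cell((x, carry), ps, st)
    end
    return y, st
end

rng = Random.default_rng()
cell = LSTMCell(3 => 5)
ps, st = Lux.setup(rng, cell)
seq = [randn(rng, Float32, 3, 4) for _ in 1:10]   # 10 steps of a 3-feature, batch-4 input
y, _ = run_sequence(cell, seq, ps, st)
```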
v0.4.13
Lux v0.4.13
Diff since v0.4.12
Closed issues:
Make it easier to pass empty state st = (;) (#118) (see the first sketch below)
is there transposed convolution (#122) (see the second sketch below)
Support for multidimensional data? (#123)
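On #118: for stateless layers, Lux.setup already returns the empty NamedTuple, so st = (;) can simply be passed through. A minimal sketch (shapes are illustrative):

```julia
using Lux, Random

rng = Random.default_rng()
layer = Dense(4, 2)
ps, st = Lux.setup(rng, layer)

st == (;)                      # Dense is stateless: its state is the empty NamedTuple
x = randn(rng, Float32, 4, 3)
y, _ = layer(x, ps, (;))       # so (;) can be passed directly as the state argument
```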
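On #122: a minimal sketch of a transposed (fractionally strided) convolution using the ConvTranspose layer; the kernel size, stride, padding, and shapes below are assumptions for illustration.

```julia
using Lux, Random

rng = Random.default_rng()

# 4×4 kernel, stride 2, padding 1: upsamples 16×16 feature maps to 32×32.
model = ConvTranspose((4, 4), 8 => 3; stride=2, pad=1)

ps, st = Lux.setup(rng, model)
x = randn(rng, Float32, 16, 16, 8, 2)   # W × H × C × N
y, _ = model(x, ps, st)
size(y)                                  # (32, 32, 3, 2) with these settings
```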
Merged pull requests:
v0.4.12
Lux v0.4.12
Diff since v0.4.11
Closed issues:
optimising parameters with Optimization.jl (#108) (see the sketch below)
add OrdinaryDiffEq downstream test (#110)
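On #108: one way (a sketch, not the only one) to optimise Lux parameters with Optimization.jl is to flatten them into a ComponentArray and hand that to an OptimizationProblem; the toy model, data, optimiser, and iteration count are assumptions for illustration.

```julia
using Lux, Random, ComponentArrays, Optimization, OptimizationOptimisers, Zygote

rng = Random.default_rng()
model = Dense(1, 1)
ps, st = Lux.setup(rng, model)
ps = ComponentArray(ps)                  # flat, optimiser-friendly view of the parameters

x = randn(rng, Float32, 1, 32)
y = 2 .* x .+ 1                          # toy regression target

# Optimization.jl expects loss(u, p); u is what gets optimised (the parameters).
function loss(u, _)
    ŷ, _ = model(x, u, st)
    return sum(abs2, ŷ .- y)
end

optf = OptimizationFunction(loss, Optimization.AutoZygote())
prob = OptimizationProblem(optf, ps)
sol = solve(prob, Adam(0.05); maxiters=200)
```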
Merged pull requests:
v0.4.11
Lux v0.4.11
Diff since v0.4.10
Closed issues:
WeightNorm causes NaN for Conv layer gradients (#95) (see the first sketch below)
Merged pull requests:
[LuxTraining] Wrappers for less clunky training loops (#104) (@avik-pal) (see the second sketch below)
Fixes WeightNorm with zero Parameter bug (#106) (@avik-pal)
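On #95/#106: WeightNorm reparameterises selected parameters of a wrapped layer, and normalising a zero-initialised parameter (such as a bias) divides by a zero norm, which is what produced the NaN gradients. A minimal sketch, with assumed kernel size and shapes, that normalises only the convolution weight:

```julia
using Lux, Random

rng = Random.default_rng()

# Wrap a Conv layer and reparameterise only :weight; the zero-initialised bias
# is left alone, avoiding the division by zero behind the NaN gradients in #95.
model = WeightNorm(Conv((3, 3), 3 => 8, relu), (:weight,))

ps, st = Lux.setup(rng, model)
x = randn(rng, Float32, 16, 16, 3, 4)
y, _ = model(x, ps, st)
```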
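On #104: the PR adds wrappers around the kind of hand-written loop sketched below. The sketch itself uses only plain Zygote and Optimisers calls (not the wrapper API), and the model, data, optimiser, and epoch count are illustrative.

```julia
using Lux, Optimisers, Random, Zygote

# The manual loop the #104 wrappers aim to streamline: compute gradients with
# Zygote, then update parameters and optimiser state with Optimisers.
function train(model, data; epochs=100, rng=Random.default_rng())
    ps, st = Lux.setup(rng, model)
    opt_state = Optimisers.setup(Optimisers.Adam(0.01f0), ps)
    for _ in 1:epochs, (x, y) in data
        gs = Zygote.gradient(p -> sum(abs2, first(model(x, p, st)) .- y), ps)
        opt_state, ps = Optimisers.update(opt_state, ps, gs[1])
    end
    return ps, st
end

model = Chain(Dense(2, 16, tanh), Dense(16, 1))
data = [(randn(Float32, 2, 64), randn(Float32, 1, 64))]
ps, st = train(model, data)
```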