CHANGELOG.md

v0.4

v0.4.11

  • Introduces the Lux.Training API for writing less boilerplate-heavy training loops (see the sketch after this list)
  • WeightNorm now errors when invoked on a parameter whose elements are all zero, since the norm would be zero and normalization would divide by zero
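
A minimal sketch of one training step with the new API, assuming the TrainState / compute_gradients / apply_gradients shape documented for later Lux releases (the exact constructor and AD-selector names in v0.4.11 may differ); the model, loss, and data here are illustrative:

```julia
using ADTypes, Lux, Optimisers, Random, Zygote

rng = Random.default_rng()
model = Chain(Dense(4 => 16, relu), Dense(16 => 1))
ps, st = Lux.setup(rng, model)

# TrainState bundles the model, parameters, states, and optimiser state.
tstate = Lux.Training.TrainState(model, ps, st, Optimisers.Adam(0.001f0))

# The objective must return (loss, updated_states, stats).
function mse_loss(model, ps, st, (x, y))
    ŷ, st = model(x, ps, st)
    return sum(abs2, ŷ .- y), st, (;)
end

x, y = rand(rng, Float32, 4, 32), rand(rng, Float32, 1, 32)
grads, loss, stats, tstate = Lux.Training.compute_gradients(
    AutoZygote(), mse_loss, (x, y), tstate)
tstate = Lux.Training.apply_gradients(tstate, grads)
```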

v0.4.10

  • Fixes a bug in recurrent models on GPUs

v0.4.8

  • Deprecations
    • ActivationFunction -- See reasoning in LuxDL#71
    • Default parameterlength / statelength / initialparameters / initialstates for certain types
    • trainmode / testmode with mode argument
    • Conv: bias deprecated in favor of use_bias (see the example after this list)
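
For the Conv deprecation, the keyword swap looks like the following (the layer shape here is illustrative):

```julia
using Lux

# Deprecated: Conv((3, 3), 3 => 16, relu; bias=false)
layer = Conv((3, 3), 3 => 16, relu; use_bias=false)
```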

v0.4.7

  • Adds a manual detailing the Lux interface
  • Fixes a bug with ComponentArray + Optimisers (FluxML/Optimisers.jl#91)
  • Dropout layers now cache 1 / (1 - p), a minor optimization to the forward pass
  • dropout now has a custom rrule, which significantly improves performance for smaller arrays
  • WARN: This release introduced a bug affecting ComponentArray + CUDA + Optimisers usage. Please update to v0.4.8 or higher.

v0.4.6

  • Documentation revamped
  • gpu and cpu give a deprecation warning if used on an AbstractExplicitLayer

v0.4.5

  • Allow Arbitrary Parameter Types

v0.4.4

  • Updated to support Julia v1.6 (works around test-time dependency issues)

v0.4.3

  • Extends Scale to allow inputs with multiple dimensions (LuxDL#40); see the sketch below
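
A brief sketch of a multi-dimensional Scale, assuming the constructor accepts several size arguments as introduced by LuxDL#40 (sizes here are illustrative):

```julia
using Lux, Random

layer = Scale(5, 3)  # elementwise weight and bias of size (5, 3)
ps, st = Lux.setup(Random.default_rng(), layer)
y, _ = layer(rand(Float32, 5, 3, 8), ps, st)  # (5, 3) parameters broadcast over the batch dimension
```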

v0.4.2

  • SelectDim is no longer type-unstable; the layer's internal storage has been changed
  • Dropout & VariationalDropout return NoOpLayer if the dropout probability is 0 (see the example after this list)
  • Code formatting now follows SciMLStyle (LuxDL#31)
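
The Dropout short-circuit means a zero dropout probability constructs a no-op layer (the same applies to the variational variant); a minimal illustration:

```julia
using Lux

Dropout(0.0) isa NoOpLayer  # true: with p = 0 the constructor returns NoOpLayer()
```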

v0.4.1

  • Fix math rendering in docs
  • Add Setfield compat for v1.0