
1.4.0

Released by @cofri · 10 Jan 17:16 · 9c2f1a7

New features and improvements

  • Two new layers (usage sketch after this list):
    • SpectralConv2DTranspose, a Lipschitz version of the Keras Conv2DTranspose layer
    • Householder, an activation layer that is a parametrized generalization of GroupSort2
  • Two new regularizers to foster orthogonality (usage sketch after this list):
    • LorthRegularizer for orthogonal convolutions
    • OrthDenseRegularizer for an orthogonal Dense kernel matrix
  • Two new losses for Lipschitz networks (usage sketch after this list):
    • TauCategoricalCrossentropy, a categorical cross-entropy loss with temperature scaling tau
    • CategoricalHinge, a hinge loss for multi-class problems based on the Keras CategoricalHinge implementation
  • Two new custom callbacks (usage sketch after this list):
    • LossParamScheduler to change loss hyper-parameters during training, e.g. min_margin, alpha, and tau
    • LossParamLog to log the values of loss parameters
  • The Björck orthogonalization algorithm was accelerated.
  • Normalizers (power iteration and Björck) now use tf.while_loop; the swap_memory argument can be set globally with set_swap_memory(bool). The default is True, which reduces GPU memory usage (see the last sketch after this list).
  • The new function set_stop_grad_spectral(bool) makes it possible to bypass back-propagation through the power iteration algorithm that computes the spectral norm. The default is True; stopping gradient propagation reduces runtime.
  • Due to bugs in TensorFlow serialization of custom losses and metrics (versions 2.0 and 2.1), deel-lip now requires TensorFlow >= 2.2.
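
The sketches below illustrate the new APIs. First, the two new layers; this is a minimal sketch, assuming the module paths follow the existing deel-lip layout (deel.lip.layers, deel.lip.activations, deel.lip.model) and that Householder takes no required arguments:

```python
# Minimal sketch: module paths and constructor arguments are assumptions
# based on the existing deel-lip layout, not the documented 1.4.0 API.
from deel.lip.layers import SpectralConv2D, SpectralConv2DTranspose
from deel.lip.activations import Householder
from deel.lip.model import Sequential

model = Sequential([
    SpectralConv2D(16, (3, 3), input_shape=(28, 28, 1)),
    Householder(),                       # parametrized generalization of GroupSort2
    SpectralConv2DTranspose(1, (3, 3)),  # Lipschitz version of Conv2DTranspose
])
```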
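
The regularizers plug into standard Keras layers through kernel_regularizer. A sketch, assuming both classes live in deel.lip.regularizers and work with default arguments:

```python
# Sketch only: the module path and default-argument usage are assumptions.
from tensorflow.keras.layers import Conv2D, Dense
from deel.lip.regularizers import LorthRegularizer, OrthDenseRegularizer

conv = Conv2D(16, (3, 3), kernel_regularizer=LorthRegularizer())  # orthogonal convolution
dense = Dense(10, kernel_regularizer=OrthDenseRegularizer())      # orthogonal Dense kernel
```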
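
Compiling with the new losses, reusing the model from the first sketch; the constructor parameter names (tau, min_margin) are assumptions consistent with the hyper-parameters named in the callback item above:

```python
# Sketch only: parameter names `tau` and `min_margin` are assumptions.
from deel.lip.losses import TauCategoricalCrossentropy, CategoricalHinge

model.compile(optimizer="adam", loss=TauCategoricalCrossentropy(tau=10.0))
# or, for a multi-class hinge objective:
model.compile(optimizer="adam", loss=CategoricalHinge(min_margin=1.0))
```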
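
The callbacks hook loss hyper-parameters into training. The call signatures below are illustrative assumptions, not documented here; x_train and y_train stand for user data, and model comes from the first sketch:

```python
# Sketch only: the argument names shown are illustrative assumptions,
# not the documented signatures of these callbacks.
from deel.lip.callbacks import LossParamScheduler, LossParamLog

callbacks = [
    # Anneal the loss temperature `tau` over the first 1000 training steps.
    LossParamScheduler(param_name="tau", fp=[100.0, 10.0], xp=[0, 1000]),
    # Log the current value of `tau` during training.
    LossParamLog(param_name="tau"),
]
model.fit(x_train, y_train, epochs=10, callbacks=callbacks)
```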
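
Finally, the two new global switches; this assumes they are exposed in deel.lip.normalizers, where the power iteration and Björck routines live:

```python
# Sketch only: the module path `deel.lip.normalizers` is an assumption.
from deel.lip.normalizers import set_swap_memory, set_stop_grad_spectral

set_swap_memory(True)         # default; trades runtime for lower GPU memory in tf.while_loop
set_stop_grad_spectral(True)  # default; skips back-propagation through power iteration
```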

Fixes

  • SpectralInitializer no longer reuses the same base initializer across multiple instances.

Full Changelog: v1.3.0...v1.4.0