- Depthwise convolutional layer API changes from `in => mult` channel specification to `in => out` channel specification, and deprecates the implicit `out` constructor (see the sketch after this list).
- New SkipConnection, which can be used to train residual neural network architectures (example below).
- New RADAM optimiser.
- Dropout now has a `dims` argument for specifying the unbroadcast dimensions (see the dropout sketch below).
- New ConvTranspose layer (included in the new-layers sketch below).
- New Maxout layer (also in the new-layers sketch below).
- Datasets are now hash verified on download to avoid corruption.
- We now zero the initial state for RNNs.
- Normalisation can now work on arbitrary `dims` (see the normalisation sketch below).
- Many docs and bugfixes thanks to @KristofferC and others.
- NamedTuples now work like Tuples when doing `mapleaves` (example below).
- New "performance tips" section of the docs.
- The training loop is now more readable and better shows how to use the lower-level APIs.
- New AlphaDropout (included in the dropout sketch below).
- Data.Iris makes Fisher's Iris dataset available with `Iris.labels` and `Iris.features` (example below).
- New InstanceNorm, as popularized by Instance Normalization: The Missing Ingredient for Fast Stylization (see the normalisation sketch below).
- New GroupNorm, as described in Group Normalization (also in the normalisation sketch below).
- New CrossCor (included in the new-layers sketch below).
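
To make the DepthwiseConv change concrete, here is a minimal sketch of the new `in => out` channel specification; the kernel size and channel counts are illustrative assumptions, not from the release notes.

```julia
using Flux

# New API: explicit in => out channel specification, where out must be
# a multiple of in (here a channel multiplier of 2).
layer = DepthwiseConv((3, 3), 8 => 16)

x = rand(Float32, 32, 32, 8, 1)  # WHCN layout: width, height, channels, batch
y = layer(x)                     # 30×30×16×1 with default stride and padding
```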
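A sketch of a residual block built with SkipConnection; the inner layers and the choice of `+` as the combining function are illustrative.

```julia
using Flux

# SkipConnection(layers, connection) computes connection(layers(x), x),
# so combining with + gives a standard residual connection.
block = SkipConnection(Chain(Dense(10, 10, relu), Dense(10, 10)), +)

x = rand(Float32, 10, 4)
y = block(x)   # same as Chain(...)(x) + x
```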
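The dropout sketch, showing the new `dims` argument together with AlphaDropout; the keyword spelling `dims = 1` follows later Flux versions, and the drop probabilities are arbitrary.

```julia
using Flux

# Default dropout: an independent mask per element.
Dropout(0.5)

# With dims, the mask is sampled only along the given dimensions and
# broadcast across the rest; dims = 1 keeps or drops each feature (row)
# as a whole across a batch of column vectors.
Dropout(0.5; dims = 1)

# AlphaDropout is designed to preserve the self-normalising property
# of selu networks.
AlphaDropout(0.5)
```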
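The new-layers sketch: ConvTranspose, Maxout and CrossCor side by side; kernel sizes, channel counts and activations are illustrative.

```julia
using Flux

# ConvTranspose performs the transpose of convolution, commonly used
# for upsampling: here 16 => 8 channels with a 3×3 kernel.
up = ConvTranspose((3, 3), 16 => 8, relu)

# Maxout takes an element-wise maximum over several inner layers; this
# constructor builds 4 alternatives from the given closure.
mo = Maxout(() -> Dense(10, 5), 4)

# CrossCor works like Conv but computes cross-correlation, i.e. the
# kernel is not flipped.
cc = CrossCor((3, 3), 1 => 16, relu)
```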
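The normalisation sketch: arbitrary `dims` for normalise plus the new InstanceNorm and GroupNorm layers; the `dims` keyword spelling and the channel/group counts are assumptions for illustration.

```julia
using Flux

# Normalise each column of a matrix to zero mean and unit standard
# deviation by choosing the dimension explicitly.
Flux.normalise(rand(Float32, 4, 3); dims = 1)

# InstanceNorm normalises each channel of each sample independently;
# the argument is the number of channels.
inorm = InstanceNorm(16)

# GroupNorm normalises over groups of channels: 16 channels in 4 groups.
gnorm = GroupNorm(16, 4)

y = gnorm(rand(Float32, 32, 32, 16, 1))
```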
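A small sketch of `mapleaves` recursing into a NamedTuple just as it does into a Tuple; the structure and the doubling function are arbitrary.

```julia
using Flux

nt = (W = rand(Float32, 2, 2), b = zeros(Float32, 2))

# mapleaves applies a function to every array leaf of a nested
# structure; NamedTuples are now traversed like Tuples.
doubled = Flux.mapleaves(x -> 2 .* x, nt)
```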
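A quick usage sketch of the two Iris accessors named above; the shapes in the comments are those of the standard Fisher dataset.

```julia
using Flux

features = Flux.Data.Iris.features()  # 4×150 matrix of measurements
labels   = Flux.Data.Iris.labels()    # 150-element vector of species names
```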
AD Changes:
- `det`, `logdet` and `logabsdet` now have adjoints (example below).
- Support for PermuteDimsArray.
- Flux.Tracker is now its own package, in preparation for replacing it with Zygote.
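
A sketch of differentiating through `logdet` with the Tracker backend of this era; the matrix is an arbitrary well-conditioned example.

```julia
using Flux, LinearAlgebra

A = rand(Float32, 3, 3) + 3I   # shift the diagonal so that det(A) > 0

# The new adjoint lets gradients flow through logdet; for invertible A
# the gradient is transpose(inv(A)).
g, = Flux.Tracker.gradient(X -> logdet(X), A)
```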
Despite the heroic efforts of scholars and archeologists, pre-0.7 history is lost to the sands of time.