I am doing some experiments with the Elman Neural Network, and while reading the JavaDoc I noticed this inconsistency.
This is what is reported in the description of the class:
The specified activation function will be used on all layers
and this is the description of the method setActivationFunction(ActivationFunction activation):
Set the activation function to use on each of the layers.
However, in the code it is clearly visible that this is not true, since the last layer has null hard-coded as its activation function:
public MLMethod generate() {
    BasicLayer hidden, input;
    final BasicNetwork network = new BasicNetwork();
    network.addLayer(input = new BasicLayer(this.activation, true,
            this.inputNeurons));
    network.addLayer(hidden = new BasicLayer(this.activation, true,
            this.hiddenNeurons));
    // Output layer: activation is hard-coded to null, ignoring this.activation.
    network.addLayer(new BasicLayer(null, false, this.outputNeurons));
    input.setContextFedBy(hidden);
    network.getStructure().finalizeStructure();
    network.reset();
    return network;
}
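For reference, here is a short sketch that makes the mismatch visible. It assumes the standard ElmanPattern setters with placeholder layer sizes (2/5/1), and it reads the per-layer activation functions through the flattened network, which is just my assumption of the simplest way to inspect them:

import org.encog.engine.network.activation.ActivationFunction;
import org.encog.engine.network.activation.ActivationSigmoid;
import org.encog.neural.networks.BasicNetwork;
import org.encog.neural.pattern.ElmanPattern;

public class ElmanActivationCheck {
    public static void main(String[] args) {
        ElmanPattern pattern = new ElmanPattern();
        pattern.setInputNeurons(2);   // placeholder sizes
        pattern.addHiddenLayer(5);
        pattern.setOutputNeurons(1);
        pattern.setActivationFunction(new ActivationSigmoid());

        BasicNetwork network = (BasicNetwork) pattern.generate();

        // Print the activation function stored for each layer of the
        // flattened network; the output layer does not come back as sigmoid.
        for (ActivationFunction af
                : network.getStructure().getFlat().getActivationFunctions()) {
            System.out.println(af == null ? "null" : af.getClass().getSimpleName());
        }
    }
}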
I saw that someone else reported this in spring 2015 (without any answer), but it seems nothing has changed.
Is there any reason to hard-code the linear activation function on the last layer?
A commit later referenced from this issue addresses the problem:
…tivation function in pattern. Also fix elman pattern's incorrect use of an activation function on the input layer. Encog input layers do not have activation functions, as they do not have a previous layer.
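Until that fix is available in a build you can use, one workaround is to skip ElmanPattern and wire up the same topology by hand, passing the output activation explicitly. A minimal sketch, assuming tanh activations and caller-supplied layer sizes; apart from the null-free output layer and the null input activation (per the commit above), it mirrors the generate() code quoted earlier:

import org.encog.engine.network.activation.ActivationTANH;
import org.encog.neural.networks.BasicNetwork;
import org.encog.neural.networks.layers.BasicLayer;

public final class ManualElman {
    public static BasicNetwork build(int inputNeurons, int hiddenNeurons,
            int outputNeurons) {
        final BasicNetwork network = new BasicNetwork();
        BasicLayer input, hidden;
        // Input layers carry no activation function (no previous layer).
        network.addLayer(input = new BasicLayer(null, true, inputNeurons));
        network.addLayer(hidden = new BasicLayer(new ActivationTANH(), true,
                hiddenNeurons));
        // Output layer: pass the activation you actually want instead of null.
        network.addLayer(new BasicLayer(new ActivationTANH(), false,
                outputNeurons));
        // Elman recurrence: the hidden layer's output is fed back as context.
        input.setContextFedBy(hidden);
        network.getStructure().finalizeStructure();
        network.reset();
        return network;
    }
}

A network built this way, e.g. ManualElman.build(2, 5, 1), can then be trained like any other BasicNetwork.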