Activation Function on Elman Neural Network #246

Open
AleZonta opened this issue Jul 20, 2017 · 2 comments
Comments
@AleZonta

I am doing some experiments with the Elman Neural Network, and while reading the JavaDoc I noticed an inconsistency.
This is what is reported in the description of the class:

The specified activation function will be used on all layers

and this is the description of the method setActivationFunction(ActivationFunction activation):

Set the activation function to use on each of the layers.

However, in the code it is clearly visible that this is not true, since the last layer's activation function is hard-coded to null:

public MLMethod generate() {
		BasicLayer hidden, input;

		final BasicNetwork network = new BasicNetwork();
		network.addLayer(input = new BasicLayer(this.activation, true,
				this.inputNeurons));
		network.addLayer(hidden = new BasicLayer(this.activation, true,
				this.hiddenNeurons));
		network.addLayer(new BasicLayer(null, false, this.outputNeurons));
		input.setContextFedBy(hidden);
		network.getStructure().finalizeStructure();
		network.reset();
		return network;
	}

I saw that someone else reported this in spring 2015 (without any answer), but it seems nothing has changed.
Is there any reason to hard-code the linear activation function on the last layer?

@jeffheaton
Owner

The activation function of an Elman network could be something other than linear, such as softmax; I will extend the class to allow that.
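The shape of that extension could look like the sketch below. Encog itself is not included here, so this is a self-contained stand-in: the `Activation` interface and `ElmanPatternSketch` class are hypothetical stubs (Encog's real `ActivationFunction` interface and `ElmanPattern` class differ), used only to illustrate giving the output layer its own configurable activation, defaulting to identity/linear, which is what `null` means on an Encog `BasicLayer`.

```java
// Hypothetical stand-in for Encog's ActivationFunction interface.
interface Activation {
    double apply(double x);
}

// Sketch of an Elman-style pattern that lets the caller choose the
// output-layer activation instead of hard-coding null (linear).
class ElmanPatternSketch {
    private Activation hiddenActivation = Math::tanh; // example default
    private Activation outputActivation = x -> x;     // identity = linear, like null in Encog

    // Existing behaviour: one activation for input/hidden layers.
    void setActivationFunction(Activation a) {
        this.hiddenActivation = a;
    }

    // Proposed extension: a separate, optional output-layer activation.
    void setOutputActivation(Activation a) {
        this.outputActivation = a;
    }

    // Applies only the output layer's activation to a pre-activation value,
    // to show the configurable behaviour in isolation.
    double forwardOutput(double preActivation) {
        return outputActivation.apply(preActivation);
    }
}
```

With no call to `setOutputActivation`, the output stays linear (matching the current hard-coded behaviour); after setting, say, a sigmoid, the output layer is squashed accordingly.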

@jeffheaton
Owner

Same issue as #205

jeffheaton added a commit that referenced this issue Sep 3, 2017
…tivation function in pattern. Also fix elman pattern's incorrect use of an activation function on the input layer. Encog input layers do not have activation functions, as they do not have a previous layer.
@jeffheaton jeffheaton added this to the Encog v3.4.1 milestone Sep 3, 2017
2 participants