PyGAD 2.7.0 Regression Support
Changes in PyGAD 2.7.0 (11 September 2020):

1. The `learning_rate` parameter of the `pygad.nn.train()` function now defaults to **0.01**.
2. Added support for building neural networks for regression using the new parameter named `problem_type`. It is accepted by both the `pygad.nn.train()` and `pygad.nn.predict()` functions. Its value can be either **classification** or **regression** to define the problem type, and it defaults to **classification**.
3. The activation function of a layer can be set to the string `"None"` to indicate that the layer has no activation function. As a result, the supported values for the activation function are `"sigmoid"`, `"relu"`, `"softmax"`, and `"None"`.

To build a regression network using the `pygad.nn` module, do the following:

1. Set the `problem_type` parameter in the `pygad.nn.train()` and `pygad.nn.predict()` functions to the string `"regression"`.
2. Set the activation function of the output layer to the string `"None"`. This places no limit on the output range, which spans `-infinity` to `+infinity`. If you are sure that all outputs will be nonnegative, you can use the ReLU function instead. A minimal sketch follows this list.
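
A minimal sketch of these two steps, mirroring the `example_regression.py` file added in this commit (the toy data is illustrative only):

```python
import numpy
import pygad.nn

# Two toy samples with 4 features each, and one continuous target per sample.
data_inputs = numpy.array([[2, 5, -3, 0.1],
                           [8, 15, 20, 13]])
data_outputs = numpy.array([0.1, 1.5])

# Step 2: the output layer uses activation_function="None", so its outputs are unbounded.
input_layer = pygad.nn.InputLayer(data_inputs.shape[1])
hidden_layer = pygad.nn.DenseLayer(num_neurons=2, previous_layer=input_layer, activation_function="relu")
output_layer = pygad.nn.DenseLayer(num_neurons=1, previous_layer=hidden_layer, activation_function="None")

# Step 1: problem_type="regression" switches both training and prediction to regression.
pygad.nn.train(num_epochs=100,
               last_layer=output_layer,
               data_inputs=data_inputs,
               data_outputs=data_outputs,
               learning_rate=0.01,
               problem_type="regression")
predictions = pygad.nn.predict(last_layer=output_layer,
                               data_inputs=data_inputs,
                               problem_type="regression")
```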

Check the documentation of the `pygad.nn` module for an example that builds a neural network for regression. The regression example is also available in [this GitHub project](https://github.com/ahmedfgad/NumPyANN).

To build and train a regression network using the `pygad.gann` module, do the following:

1. Set the `problem_type` parameter in the `pygad.nn.train()` and `pygad.nn.predict()` functions to the string `"regression"`.
2. Set the `output_activation` parameter in the constructor of the `pygad.gann.GANN` class to `"None"`, as in the sketch after this list.
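
A minimal sketch of these two steps, assuming the `pygad.gann.GANN` constructor parameters from the PyGAD documentation (`num_solutions`, `num_neurons_input`, `num_neurons_output`, `num_neurons_hidden_layers`, `hidden_activations`, `output_activation`):

```python
import pygad.gann

# A population of 6 networks: 4 inputs, one hidden layer of 2 ReLU neurons, and 1 output.
# output_activation="None" leaves the single output unbounded, as regression requires.
gann_creation = pygad.gann.GANN(num_solutions=6,
                                num_neurons_input=4,
                                num_neurons_hidden_layers=[2],
                                num_neurons_output=1,
                                hidden_activations=["relu"],
                                output_activation="None")
```

The networks in the population are then trained and queried through the `pygad.nn.train()` and `pygad.nn.predict()` functions with `problem_type="regression"`, as in step 1.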

Check the documentation of the `pygad.gann` module for an example that builds and trains a neural network for regression. The regression example is also available in [this GitHub project](https://github.com/ahmedfgad/NeuralGenetic).

To build a classification network, either omit the `problem_type` parameter or set it to `"classification"` (the default value). In this case, the activation function of the last layer can be set to any supported type (e.g., softmax).
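
For instance, the prediction call in the XOR example added in this commit relies on that default:

```python
# problem_type defaults to "classification", so no extra argument is needed.
predictions = pygad.nn.predict(last_layer=output_layer, data_inputs=data_inputs)
```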
ahmedfgad authored Sep 11, 2020
1 parent c08cd0e commit ac266fb
Showing 6 changed files with 358 additions and 3 deletions.
160 changes: 160 additions & 0 deletions Fish.csv
@@ -0,0 +1,160 @@
Species,Weight,Length1,Length2,Length3,Height,Width
Bream,242,23.2,25.4,30,11.52,4.02
Bream,290,24,26.3,31.2,12.48,4.3056
Bream,340,23.9,26.5,31.1,12.3778,4.6961
Bream,363,26.3,29,33.5,12.73,4.4555
Bream,430,26.5,29,34,12.444,5.134
Bream,450,26.8,29.7,34.7,13.6024,4.9274
Bream,500,26.8,29.7,34.5,14.1795,5.2785
Bream,390,27.6,30,35,12.67,4.69
Bream,450,27.6,30,35.1,14.0049,4.8438
Bream,500,28.5,30.7,36.2,14.2266,4.9594
Bream,475,28.4,31,36.2,14.2628,5.1042
Bream,500,28.7,31,36.2,14.3714,4.8146
Bream,500,29.1,31.5,36.4,13.7592,4.368
Bream,340,29.5,32,37.3,13.9129,5.0728
Bream,600,29.4,32,37.2,14.9544,5.1708
Bream,600,29.4,32,37.2,15.438,5.58
Bream,700,30.4,33,38.3,14.8604,5.2854
Bream,700,30.4,33,38.5,14.938,5.1975
Bream,610,30.9,33.5,38.6,15.633,5.1338
Bream,650,31,33.5,38.7,14.4738,5.7276
Bream,575,31.3,34,39.5,15.1285,5.5695
Bream,685,31.4,34,39.2,15.9936,5.3704
Bream,620,31.5,34.5,39.7,15.5227,5.2801
Bream,680,31.8,35,40.6,15.4686,6.1306
Bream,700,31.9,35,40.5,16.2405,5.589
Bream,725,31.8,35,40.9,16.36,6.0532
Bream,720,32,35,40.6,16.3618,6.09
Bream,714,32.7,36,41.5,16.517,5.8515
Bream,850,32.8,36,41.6,16.8896,6.1984
Bream,1000,33.5,37,42.6,18.957,6.603
Bream,920,35,38.5,44.1,18.0369,6.3063
Bream,955,35,38.5,44,18.084,6.292
Bream,925,36.2,39.5,45.3,18.7542,6.7497
Bream,975,37.4,41,45.9,18.6354,6.7473
Bream,950,38,41,46.5,17.6235,6.3705
Roach,40,12.9,14.1,16.2,4.1472,2.268
Roach,69,16.5,18.2,20.3,5.2983,2.8217
Roach,78,17.5,18.8,21.2,5.5756,2.9044
Roach,87,18.2,19.8,22.2,5.6166,3.1746
Roach,120,18.6,20,22.2,6.216,3.5742
Roach,0,19,20.5,22.8,6.4752,3.3516
Roach,110,19.1,20.8,23.1,6.1677,3.3957
Roach,120,19.4,21,23.7,6.1146,3.2943
Roach,150,20.4,22,24.7,5.8045,3.7544
Roach,145,20.5,22,24.3,6.6339,3.5478
Roach,160,20.5,22.5,25.3,7.0334,3.8203
Roach,140,21,22.5,25,6.55,3.325
Roach,160,21.1,22.5,25,6.4,3.8
Roach,169,22,24,27.2,7.5344,3.8352
Roach,161,22,23.4,26.7,6.9153,3.6312
Roach,200,22.1,23.5,26.8,7.3968,4.1272
Roach,180,23.6,25.2,27.9,7.0866,3.906
Roach,290,24,26,29.2,8.8768,4.4968
Roach,272,25,27,30.6,8.568,4.7736
Roach,390,29.5,31.7,35,9.485,5.355
Whitefish,270,23.6,26,28.7,8.3804,4.2476
Whitefish,270,24.1,26.5,29.3,8.1454,4.2485
Whitefish,306,25.6,28,30.8,8.778,4.6816
Whitefish,540,28.5,31,34,10.744,6.562
Whitefish,800,33.7,36.4,39.6,11.7612,6.5736
Whitefish,1000,37.3,40,43.5,12.354,6.525
Parkki,55,13.5,14.7,16.5,6.8475,2.3265
Parkki,60,14.3,15.5,17.4,6.5772,2.3142
Parkki,90,16.3,17.7,19.8,7.4052,2.673
Parkki,120,17.5,19,21.3,8.3922,2.9181
Parkki,150,18.4,20,22.4,8.8928,3.2928
Parkki,140,19,20.7,23.2,8.5376,3.2944
Parkki,170,19,20.7,23.2,9.396,3.4104
Parkki,145,19.8,21.5,24.1,9.7364,3.1571
Parkki,200,21.2,23,25.8,10.3458,3.6636
Parkki,273,23,25,28,11.088,4.144
Parkki,300,24,26,29,11.368,4.234
Perch,5.9,7.5,8.4,8.8,2.112,1.408
Perch,32,12.5,13.7,14.7,3.528,1.9992
Perch,40,13.8,15,16,3.824,2.432
Perch,51.5,15,16.2,17.2,4.5924,2.6316
Perch,70,15.7,17.4,18.5,4.588,2.9415
Perch,100,16.2,18,19.2,5.2224,3.3216
Perch,78,16.8,18.7,19.4,5.1992,3.1234
Perch,80,17.2,19,20.2,5.6358,3.0502
Perch,85,17.8,19.6,20.8,5.1376,3.0368
Perch,85,18.2,20,21,5.082,2.772
Perch,110,19,21,22.5,5.6925,3.555
Perch,115,19,21,22.5,5.9175,3.3075
Perch,125,19,21,22.5,5.6925,3.6675
Perch,130,19.3,21.3,22.8,6.384,3.534
Perch,120,20,22,23.5,6.11,3.4075
Perch,120,20,22,23.5,5.64,3.525
Perch,130,20,22,23.5,6.11,3.525
Perch,135,20,22,23.5,5.875,3.525
Perch,110,20,22,23.5,5.5225,3.995
Perch,130,20.5,22.5,24,5.856,3.624
Perch,150,20.5,22.5,24,6.792,3.624
Perch,145,20.7,22.7,24.2,5.9532,3.63
Perch,150,21,23,24.5,5.2185,3.626
Perch,170,21.5,23.5,25,6.275,3.725
Perch,225,22,24,25.5,7.293,3.723
Perch,145,22,24,25.5,6.375,3.825
Perch,188,22.6,24.6,26.2,6.7334,4.1658
Perch,180,23,25,26.5,6.4395,3.6835
Perch,197,23.5,25.6,27,6.561,4.239
Perch,218,25,26.5,28,7.168,4.144
Perch,300,25.2,27.3,28.7,8.323,5.1373
Perch,260,25.4,27.5,28.9,7.1672,4.335
Perch,265,25.4,27.5,28.9,7.0516,4.335
Perch,250,25.4,27.5,28.9,7.2828,4.5662
Perch,250,25.9,28,29.4,7.8204,4.2042
Perch,300,26.9,28.7,30.1,7.5852,4.6354
Perch,320,27.8,30,31.6,7.6156,4.7716
Perch,514,30.5,32.8,34,10.03,6.018
Perch,556,32,34.5,36.5,10.2565,6.3875
Perch,840,32.5,35,37.3,11.4884,7.7957
Perch,685,34,36.5,39,10.881,6.864
Perch,700,34,36,38.3,10.6091,6.7408
Perch,700,34.5,37,39.4,10.835,6.2646
Perch,690,34.6,37,39.3,10.5717,6.3666
Perch,900,36.5,39,41.4,11.1366,7.4934
Perch,650,36.5,39,41.4,11.1366,6.003
Perch,820,36.6,39,41.3,12.4313,7.3514
Perch,850,36.9,40,42.3,11.9286,7.1064
Perch,900,37,40,42.5,11.73,7.225
Perch,1015,37,40,42.4,12.3808,7.4624
Perch,820,37.1,40,42.5,11.135,6.63
Perch,1100,39,42,44.6,12.8002,6.8684
Perch,1000,39.8,43,45.2,11.9328,7.2772
Perch,1100,40.1,43,45.5,12.5125,7.4165
Perch,1000,40.2,43.5,46,12.604,8.142
Perch,1000,41.1,44,46.6,12.4888,7.5958
Pike,200,30,32.3,34.8,5.568,3.3756
Pike,300,31.7,34,37.8,5.7078,4.158
Pike,300,32.7,35,38.8,5.9364,4.3844
Pike,300,34.8,37.3,39.8,6.2884,4.0198
Pike,430,35.5,38,40.5,7.29,4.5765
Pike,345,36,38.5,41,6.396,3.977
Pike,456,40,42.5,45.5,7.28,4.3225
Pike,510,40,42.5,45.5,6.825,4.459
Pike,540,40.1,43,45.8,7.786,5.1296
Pike,500,42,45,48,6.96,4.896
Pike,567,43.2,46,48.7,7.792,4.87
Pike,770,44.8,48,51.2,7.68,5.376
Pike,950,48.3,51.7,55.1,8.9262,6.1712
Pike,1250,52,56,59.7,10.6863,6.9849
Pike,1600,56,60,64,9.6,6.144
Pike,1550,56,60,64,9.6,6.144
Pike,1650,59,63.4,68,10.812,7.48
Smelt,6.7,9.3,9.8,10.8,1.7388,1.0476
Smelt,7.5,10,10.5,11.6,1.972,1.16
Smelt,7,10.1,10.6,11.6,1.7284,1.1484
Smelt,9.7,10.4,11,12,2.196,1.38
Smelt,9.8,10.7,11.2,12.4,2.0832,1.2772
Smelt,8.7,10.8,11.3,12.6,1.9782,1.2852
Smelt,10,11.3,11.8,13.1,2.2139,1.2838
Smelt,9.9,11.3,11.8,13.1,2.2139,1.1659
Smelt,9.8,11.4,12,13.2,2.2044,1.1484
Smelt,12.2,11.5,12.2,13.4,2.0904,1.3936
Smelt,13.4,11.7,12.4,13.5,2.43,1.269
Smelt,12.2,12.1,13,13.8,2.277,1.2558
Smelt,19.7,13.2,14.3,15.2,2.8728,2.0672
Smelt,19.9,13.8,15,16.2,2.9322,1.8792
6 changes: 3 additions & 3 deletions README.md
@@ -1,12 +1,12 @@
# NumPyANN: Building Neural Networks using NumPy

-[NumPyANN](https://github.com/ahmedfgad/NumPyCNN) is a Python project for training neural networks using NumPy.
+[NumPyANN](https://github.com/ahmedfgad/NumPyCNN) is a Python project for building artificial neural networks using NumPy.

-[NumPyANN](https://github.com/ahmedfgad/NumPyCNN) is part of [PyGAD](https://pypi.org/project/pygad) which is an open-source Python 3 library for implementing the genetic algorithm and optimizing machine learning algorithms.
+[NumPyANN](https://github.com/ahmedfgad/NumPyCNN) is part of [PyGAD](https://pypi.org/project/pygad) which is an open-source Python 3 library for implementing the genetic algorithm and optimizing machine learning algorithms. Both regression and classification neural networks are supported starting from PyGAD 2.7.0.

Check documentation of the [NeuralGenetic](https://github.com/ahmedfgad/NeuralGenetic) project in the PyGAD's documentation: https://pygad.readthedocs.io/en/latest/README_pygad_nn_ReadTheDocs.html

-The library is under active development and more features in the genetic algorithm will be added like working with binary problems. This is in addition to supporting more machine learning algorithms.
+The library is under active development and more features are added regularly. If you want a feature to be supported, please check the **Contact Us** section to send a request.

Before using [NumPyANN](https://github.com/ahmedfgad/NumPyCNN), install PyGAD.

51 changes: 51 additions & 0 deletions example_XOR_classification.py
@@ -0,0 +1,51 @@
import numpy
import pygad.nn

"""
This project creates a neural network where the architecture has input and dense layers only. More layers will be added in the future.
The project only implements the forward pass of a neural network and no training algorithm is used.
For training a neural network using the genetic algorithm, check this project (https://github.com/ahmedfgad/NeuralGenetic) in which the genetic algorithm is used for training the network.
Feel free to leave an issue in this project (https://github.com/ahmedfgad/NumPyANN) in case something is not working properly or to ask for questions. I am also available for e-mails at [email protected]
"""

# Preparing the NumPy array of the inputs.
data_inputs = numpy.array([[1, 1],
                           [1, 0],
                           [0, 1],
                           [0, 0]])

# Preparing the NumPy array of the outputs.
data_outputs = numpy.array([0,
                            1,
                            1,
                            0])

# The number of inputs (i.e. feature vector length) per sample
num_inputs = data_inputs.shape[1]
# Number of outputs per sample
num_outputs = 2
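# One output neuron per class (labels 0 and 1). For classification, pygad.nn.predict()
# returns the index of the neuron with the largest output as the predicted label.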

HL1_neurons = 2

# Building the network architecture.
input_layer = pygad.nn.InputLayer(num_inputs)
hidden_layer1 = pygad.nn.DenseLayer(num_neurons=HL1_neurons, previous_layer=input_layer, activation_function="relu")
output_layer = pygad.nn.DenseLayer(num_neurons=num_outputs, previous_layer=hidden_layer1, activation_function="softmax")

# Training the network.
pygad.nn.train(num_epochs=100,
               last_layer=output_layer,
               data_inputs=data_inputs,
               data_outputs=data_outputs,
               learning_rate=0.01)

# Using the trained network for predictions.
predictions = pygad.nn.predict(last_layer=output_layer, data_inputs=data_inputs)

# Calculating some statistics
num_wrong = numpy.where(predictions != data_outputs)[0]
num_correct = data_outputs.size - num_wrong.size
accuracy = 100 * (num_correct/data_outputs.size)
print("Number of correct classifications : {num_correct}.".format(num_correct=num_correct))
print("Number of wrong classifications : {num_wrong}.".format(num_wrong=num_wrong.size))
print("Classification accuracy : {accuracy}.".format(accuracy=accuracy))
51 changes: 51 additions & 0 deletions example_classification.py
@@ -0,0 +1,51 @@
import numpy
import pygad.nn

"""
This project creates a neural network where the architecture has input and dense layers only. More layers will be added in the future.
The project only implements the forward pass of a neural network and no training algorithm is used.
For training a neural network using the genetic algorithm, check this project (https://github.com/ahmedfgad/NeuralGenetic) in which the genetic algorithm is used for training the network.
Feel free to leave an issue in this project (https://github.com/ahmedfgad/NumPyANN) in case something is not working properly or to ask for questions. I am also available for e-mails at [email protected]
"""

# Reading the data features. Check the 'extract_features.py' script for extracting the features & preparing the outputs of the dataset.
data_inputs = numpy.load("dataset_features.npy") # Download from https://github.com/ahmedfgad/NumPyANN/blob/master/dataset_features.npy

# Optional step for filtering the features using the standard deviation.
features_STDs = numpy.std(a=data_inputs, axis=0)
data_inputs = data_inputs[:, features_STDs > 50]
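# Features whose standard deviation is 50 or less are dropped, shrinking the feature vector length.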

# Reading the data outputs. Check the 'extract_features.py' script for extracting the features & preparing the outputs of the dataset.
data_outputs = numpy.load("outputs.npy") # Download from https://github.com/ahmedfgad/NumPyANN/blob/master/outputs.npy

# The number of inputs (i.e. feature vector length) per sample
num_inputs = data_inputs.shape[1]
# Number of outputs per sample
num_outputs = 4

HL1_neurons = 150
HL2_neurons = 60

# Building the network architecture.
input_layer = pygad.nn.InputLayer(num_inputs)
hidden_layer1 = pygad.nn.DenseLayer(num_neurons=HL1_neurons, previous_layer=input_layer, activation_function="relu")
hidden_layer2 = pygad.nn.DenseLayer(num_neurons=HL2_neurons, previous_layer=hidden_layer1, activation_function="relu")
output_layer = pygad.nn.DenseLayer(num_neurons=num_outputs, previous_layer=hidden_layer2, activation_function="softmax")

# Training the network.
pygad.nn.train(num_epochs=10,
               last_layer=output_layer,
               data_inputs=data_inputs,
               data_outputs=data_outputs,
               learning_rate=0.01)

# Using the trained network for predictions.
predictions = pygad.nn.predict(last_layer=output_layer, data_inputs=data_inputs)

# Calculating some statistics
num_wrong = numpy.where(predictions != data_outputs)[0]
num_correct = data_outputs.size - num_wrong.size
accuracy = 100 * (num_correct/data_outputs.size)
print("Number of correct classifications : {num_correct}.".format(num_correct=num_correct))
print("Number of wrong classifications : {num_wrong}.".format(num_wrong=num_wrong.size))
print("Classification accuracy : {accuracy}.".format(accuracy=accuracy))
46 changes: 46 additions & 0 deletions example_regression.py
@@ -0,0 +1,46 @@
import numpy
import pygad.nn

"""
This example creates a neural network for regression where the architecture has input and dense layers only. More layers will be added in the future.
The project only implements the forward pass of a neural network and no training algorithm is used.
For training a neural network using the genetic algorithm, check this project (https://github.com/ahmedfgad/NeuralGenetic) in which the genetic algorithm is used for training the network.
Feel free to leave an issue in this project (https://github.com/ahmedfgad/NumPyANN) in case something is not working properly or to ask for questions. I am also available for e-mails at [email protected]
"""

# Preparing the NumPy array of the inputs.
data_inputs = numpy.array([[2, 5, -3, 0.1],
                           [8, 15, 20, 13]])

# Preparing the NumPy array of the outputs.
data_outputs = numpy.array([0.1,
                            1.5])

# The number of inputs (i.e. feature vector length) per sample
num_inputs = data_inputs.shape[1]
# Number of outputs per sample
num_outputs = 1

HL1_neurons = 2

# Building the network architecture.
input_layer = pygad.nn.InputLayer(num_inputs)
hidden_layer1 = pygad.nn.DenseLayer(num_neurons=HL1_neurons, previous_layer=input_layer, activation_function="relu")
output_layer = pygad.nn.DenseLayer(num_neurons=num_outputs, previous_layer=hidden_layer1, activation_function="None")

# Training the network.
pygad.nn.train(num_epochs=100,
               last_layer=output_layer,
               data_inputs=data_inputs,
               data_outputs=data_outputs,
               learning_rate=0.01,
               problem_type="regression")

# Using the trained network for predictions.
predictions = pygad.nn.predict(last_layer=output_layer,
                               data_inputs=data_inputs,
                               problem_type="regression")

# Calculating some statistics
abs_error = numpy.mean(numpy.abs(predictions - data_outputs))
print("Absolute error : {abs_error}.".format(abs_error=abs_error))
47 changes: 47 additions & 0 deletions example_regression_fish.py
@@ -0,0 +1,47 @@
import numpy
import pygad.nn
import pandas

"""
This example creates a neural network for regression where the architecture has input and dense layers only. More layers will be added in the future.
The project only implements the forward pass of a neural network and no training algorithm is used.
For training a neural network using the genetic algorithm, check this project (https://github.com/ahmedfgad/NeuralGenetic) in which the genetic algorithm is used for training the network.
Feel free to leave an issue in this project (https://github.com/ahmedfgad/NumPyANN) in case something is not working properly or to ask for questions. I am also available for e-mails at [email protected]
"""

data = numpy.array(pandas.read_csv("Fish.csv"))
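# The Species column holds strings, so the array above has dtype object; the numeric columns are cast to float32 below.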

# Preparing the NumPy array of the inputs.
data_inputs = numpy.asarray(data[:, 2:], dtype=numpy.float32)

# Preparing the NumPy array of the outputs.
data_outputs = numpy.asarray(data[:, 1], dtype=numpy.float32) # Fish Weight

# The number of inputs (i.e. feature vector length) per sample
num_inputs = data_inputs.shape[1]
# Number of outputs per sample
num_outputs = 1

HL1_neurons = 2

# Building the network architecture.
input_layer = pygad.nn.InputLayer(num_inputs)
hidden_layer1 = pygad.nn.DenseLayer(num_neurons=HL1_neurons, previous_layer=input_layer, activation_function="relu")
output_layer = pygad.nn.DenseLayer(num_neurons=num_outputs, previous_layer=hidden_layer1, activation_function="None")

# Training the network.
pygad.nn.train(num_epochs=100,
               last_layer=output_layer,
               data_inputs=data_inputs,
               data_outputs=data_outputs,
               learning_rate=0.01,
               problem_type="regression")

# Using the trained network for predictions.
predictions = pygad.nn.predict(last_layer=output_layer,
                               data_inputs=data_inputs,
                               problem_type="regression")

# Calculating some statistics
abs_error = numpy.mean(numpy.abs(predictions - data_outputs))
print("Absolute error : {abs_error}.".format(abs_error=abs_error))
