A Radial-Basis-Function network that uses gradient descent to adjust the RBF centers, RBF widths and weights. Implemented in Python for a programming assignment of the course Technical Neural Networks.
Simply import the function "RBFNetwork" from RBFNetwork.py:
$ python
from RBFNetwork import RBFNetwork
Now you can use this function with the following definition:
Errors, TNN = RBFNetwork(N, M, K, Weights, Patterns, LearningRateC, LearningRateS, LearningRateW, RandomSeed, MaxSteps, Batch)
N, M, K: the number of inputs, outputs and RBF neurons, respectively.
Weights: an array consisting of M arrays, one for each output neuron, each containing K numbers for the weights of the RBF neurons. Alternatively, the path of a .dat file in which the weights must appear in the aforementioned order (lines starting with # are treated as comments). Alternatively, '' or [] for random initialization with values between -0.5 and 0.5.
Patterns: a list of P patterns used to train the network, given as tuples whose first element is a list of N inputs and whose second element is a list of M outputs. Alternatively, the path of a .dat file in which, for each training pattern, the inputs are followed by the outputs, followed by the next pattern (lines starting with # are treated as comments).
LearningRateC, LearningRateS, LearningRateW: the learning rates for adjusting the centers, widths and weights, respectively.
RandomSeed: a random seed used for random initialization and shuffling, so that results can be reproduced.
MaxSteps: the maximum number of iterations for which the model can train.
Batch: a boolean value which is 1 for batch learning and 0 for single-step (online) learning.
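As a sketch of how the training arguments described above might be assembled — the dimensions, the XOR task and the learning-rate values are illustrative assumptions, not taken from RBFNetwork.py:

```python
import random

# Illustrative dimensions (assumed): 2 inputs, 1 output, 4 RBF neurons.
N, M, K = 2, 1, 4

# Weights: M arrays, each with K weights (one per RBF neuron),
# initialized here between -0.5 and 0.5 as the README describes.
random.seed(42)
Weights = [[random.uniform(-0.5, 0.5) for _ in range(K)] for _ in range(M)]

# Patterns: tuples of (list of N inputs, list of M outputs);
# the XOR task is used purely as an example dataset.
Patterns = [
    ([0.0, 0.0], [0.0]),
    ([0.0, 1.0], [1.0]),
    ([1.0, 0.0], [1.0]),
    ([1.1, 1.0], [0.0]),
]

# The call would then look like this (commented out, since it needs RBFNetwork.py):
# Errors, TNN = RBFNetwork(N, M, K, Weights, Patterns, 0.01, 0.01, 0.01, 42, 1000, 0)
```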
Errors: a list containing the squared error of each pattern in each iteration.
TNN: a function representing the trained RBF model with the calculated centers, widths and weights, with the following definition:
Y = TNN(X)
X: a list containing N values as input. Y: a list containing M values as output.
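For intuition, the forward pass of such a model typically computes Gaussian activations around each RBF center and then weighted sums per output neuron. A minimal self-contained sketch, assuming Gaussian RBFs (the actual activation used by TNN may differ):

```python
import math

def rbf_forward(X, centers, widths, weights):
    """Sketch of an RBF forward pass (assumed Gaussian activations).
    X: list of N inputs; centers: K lists of N values; widths: K values;
    weights: M lists of K values. Returns a list of M outputs."""
    activations = []
    for c, s in zip(centers, widths):
        # Squared Euclidean distance from the input to this center.
        dist_sq = sum((x - ci) ** 2 for x, ci in zip(X, c))
        activations.append(math.exp(-dist_sq / (2 * s ** 2)))
    # Each output is a weighted sum of all RBF activations.
    return [sum(w * a for w, a in zip(w_row, activations)) for w_row in weights]
```

At a center the Gaussian activation is 1, so an input equal to the only center simply returns that neuron's weight.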
Please check "RBFNetwork-Test.py" for an example.
To plot the learning curve of the model using gnuplot, a function called gnuplotOut, which writes a file readable by gnuplot, has been implemented. Check the example for more details.
Ali Mohammadi
Rozhin Bayati
Best Regards