3.33 Neural Networks

A neural network (NN) is a powerful tool for data classification: if you have a source of input data and know what the output should look like, you can estimate relationships between input and output.

A neural net can be constructed using the class HNeuralNet, which is based on the Encog project. Let us build a feed-forward neural net with an input layer, two hidden layers and one output layer. The input layer will contain 4 neurons, the two hidden layers will have 5 neurons each, and the output layer will have 1 neuron:

>>> from jhplot import *
>>> net = HNeuralNet()
>>> net.addFeedForwardLayer(4)
>>> net.addFeedForwardLayer(5)
>>> net.addFeedForwardLayer(5)
>>> net.addFeedForwardLayer(1)
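The 4-5-5-1 layer structure above can be sketched in plain Python. This is only a minimal illustration of what a feed-forward pass does, not the Encog implementation; the sigmoid activation and the random weight initialisation are assumptions:

```python
import math, random

def feed_forward(inputs, layers):
    """Propagate an input vector through fully connected layers.
    Each layer is a pair (weights, thresholds): weights[j][i] connects
    input i to neuron j; a sigmoid is applied to each activation."""
    signal = inputs
    for weights, thresholds in layers:
        signal = [
            1.0 / (1.0 + math.exp(-(sum(w * s for w, s in zip(row, signal)) + t)))
            for row, t in zip(weights, thresholds)
        ]
    return signal

random.seed(1)

def random_layer(n_in, n_out):
    # random weights and thresholds in [-1, 1], as a reset() would produce
    return ([[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)],
            [random.uniform(-1, 1) for _ in range(n_out)])

# 4 -> 5 -> 5 -> 1 architecture, mirroring the addFeedForwardLayer() calls
layers = [random_layer(4, 5), random_layer(5, 5), random_layer(5, 1)]
output = feed_forward([0.1, 0.5, 0.9, 0.3], layers)
print(output)  # a single number between 0 and 1
```

Until the weights are trained, the output of such a network is of course meaningless; training is the subject of the next step.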

Below we show how to initialise the network and display it:

>>> net.reset()
>>> net.showNetwork()

Next, the created network should learn how to adjust the weights and thresholds of all neurons. This is done with the method:

>>> net.trainBackpropagation(b,max,learnRate,momentum,errEpoch)
where:

b
- is true ("1") if a pop-up window is required which shows the learning rate and the learning error. The smaller the error on the Y axis, the higher the chance that the learning is successful. Check that the error value decreases and eventually flattens out with the iteration (epoch) number. If false ("0"), you cannot monitor the learning rate and error.

max
- is the maximal number of epochs for learning. The learning continues until max is reached, but only while the epoch error is larger than errEpoch. If the maximum number of iterations or epochs is reached, the training was not successful.

learnRate
- is the rate at which the weight matrix is adjusted during learning (the so-called learning rate). This number is usually 0.1-0.2.

momentum
- is the influence that the previous iteration's training deltas have on the current iteration. Usually it is set to 0.1-0.4.

errEpoch
- is the training error at which the learning should be stopped. If the specified error is not reached during learning, the program stops after "max" epochs.

It should be noted that the epoch error can also be obtained with the method net.getEpochError() without opening the pop-up window.

It is always a good idea to call showNetwork() to look at the new threshold values.
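The interplay of learnRate, momentum, errEpoch and max described above can be sketched in plain Python. The example below minimises a simple one-dimensional quadratic error instead of a real network error, so the gradient is trivial; the function name and the error surface are illustrative assumptions, not the Encog training code:

```python
def train(max_epochs, learn_rate, momentum, err_epoch):
    """Gradient descent with momentum on a 1-D error E(w) = (w - 2)^2,
    whose gradient is dE/dw = 2*(w - 2).  It mirrors the stopping rules
    of the training method: stop when the epoch error drops below
    err_epoch, or give up after max_epochs iterations."""
    w, delta = 0.0, 0.0
    for epoch in range(1, max_epochs + 1):
        error = (w - 2.0) ** 2
        if error < err_epoch:
            return w, epoch, error          # training succeeded
        grad = 2.0 * (w - 2.0)
        # momentum mixes in the previous update, smoothing the descent
        delta = -learn_rate * grad + momentum * delta
        w += delta
    return w, max_epochs, (w - 2.0) ** 2    # hit the epoch limit

w, epochs, err = train(max_epochs=1000, learn_rate=0.1,
                       momentum=0.3, err_epoch=1e-6)
print(w, epochs, err)
```

With a moderate learning rate and momentum the loop converges well before the epoch limit; setting learn_rate too high makes the error oscillate instead of flattening out, which is exactly what the pop-up error monitor helps to spot.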

There are several examples of how to use the NN. Here is a complete set of examples:

NN_data.py - prepare data sample

NN_scaled.py - rescale the sample

NN_train.py - train using back-propagation

NN_forecast.py - make predictions

NN_analyse.py - make detailed analysis of predictions
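The rescaling step (as in NN_scaled.py above) can be sketched as a column-wise min-max transformation. This is a generic illustration of the idea, not the code of the script itself; the function name and the [0, 1] target range are assumptions:

```python
def rescale(rows):
    """Rescale each column of a data sample to [0, 1].
    Neural-net inputs are usually scaled this way so that no single
    feature dominates the weight updates during training."""
    cols = list(zip(*rows))
    lo = [min(c) for c in cols]
    hi = [max(c) for c in cols]
    return [[(v - l) / (h - l) if h != l else 0.0
             for v, l, h in zip(row, lo, hi)] for row in rows]

data = [[1.0, 200.0], [2.0, 400.0], [4.0, 300.0]]
scaled = rescale(data)
print(scaled)  # every value now lies in [0, 1]
```

The same minimum and maximum values must be kept and reused when scaling the forecast sample, otherwise the trained network sees inputs on a different scale than it was trained on.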

3.33.1 Joone-based neural net

jHepWork contains the Joone package [18], which provides a framework to create, train and test artificial neural networks. jHepWork accesses this package via Jython scripts.

A complete example of how to train and verify a NN is located in the directory neural_net. Click on the file NN_train.py and run it using the jHepWork editor. You will see a new tab with the log file, which contains the results of the NN training. This script reads the file wine_train.txt, and the NN is trained using the first 150 rows of data in this file. The first column represents the output, while the other columns are input data. Note that the input data are normalized by the script. Then the NN is tested using the next 28 rows, and the result of this test is again written to the log file.

The NN_train.py script writes the NN into the serialized file nn_wine.snet. To perform a forecast using the data in the file wine_forecast.txt, run the script NN_forecast.py (in this file, the output numbers are all set to zero).

This example is rather similar to the one in the file Validation_using_stream.java in the directory samples/engine/helpers of the Joone package. There are many explanations in the jHepWork NN scripts, but you should still read the API documentation of the Joone package (http://www.jooneworld.com/), and especially its manual. The NN scripts were built using the package "org.joone.samples.engine.helpers".

3.33.2 Bayesian networks

A Bayesian network (or Belief network) is a graphical model for manipulating probabilistic relationships among variables of interest, and building decision scenarios requiring reasoning under uncertainty. Such networks are widely used in managing uncertainty in science and engineering.

The Bayesian network is included in jHepWork via the JavaBayes package. It calculates marginal probabilities and expectations, produces explanations and performs robustness analysis. In addition, it allows the user to import, create, modify and export networks.
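The marginal probabilities mentioned above are obtained by summing over the parents of a node. A minimal sketch for a hypothetical two-node network Rain -> WetGrass (the node names and probability values are invented for illustration; JavaBayes handles arbitrary networks):

```python
def marginal_wet(p_rain, p_wet_given_rain, p_wet_given_dry):
    """Marginal probability of 'grass wet' in the two-node network
    Rain -> WetGrass, obtained by summing over the parent variable:
    P(W) = P(W|R) P(R) + P(W|~R) P(~R)."""
    return (p_wet_given_rain * p_rain +
            p_wet_given_dry * (1.0 - p_rain))

p_w = marginal_wet(p_rain=0.2, p_wet_given_rain=0.9, p_wet_given_dry=0.1)
print(p_w)  # 0.9*0.2 + 0.1*0.8 = 0.26
```

In a larger network the same sum runs over all parent configurations, which is what makes efficient inference engines such as JavaBayes useful.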

The network editor can be called using the HBayes class:

>>> from jhplot import *
>>> HBayes()

This will pop up the Bayesian-network editor and a console window. The user can follow the step-by-step instructions given in the console. The package is rather well documented, so we will not discuss it further here.