A neural network (NN) is a powerful tool for data classification: if you have a source of input data and know what the output should look like, you can estimate the relationship between input and output.
A neural net can be constructed using the class HNeuralNet, which is based on the Encog project. Let us build a feed-forward neural net with an input layer, two hidden layers and one output layer. The input layer will contain 4 neurons, the two hidden layers will have 5 neurons each, and the output layer will have 1 neuron:
>>> from jhplot import *
>>> net = HNeuralNet()
>>> net.addFeedForwardLayer(4)
>>> net.addFeedForwardLayer(5)
>>> net.addFeedForwardLayer(5)
>>> net.addFeedForwardLayer(1)
Below we show how to initialise the network and display it:
>>> net.reset()
>>> net.showNetwork()
Next, the network has to learn how to adjust the weights and thresholds of all neurons. This is done by applying the method:
>>> net.trainBackpropagation(b, max, learnRate, momentum, errEpoch)
where the arguments specify the back-propagation settings: the flag b, the maximum number of training epochs (max), the learning rate (learnRate), the momentum (momentum) and the epoch error (errEpoch).
It should be noted that the epoch error can also be obtained with the method net.getEpochError() without opening the pop-up window.
It is always a good idea to call showNetwork() to look at the new threshold values.
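Putting the training step together for the network built above, a minimal sketch could look as follows. The argument values (1000 epochs, a learning rate of 0.7, a momentum of 0.3, an epoch error of 0.01, and the flag b passed as 1) are illustrative placeholders only, and the sketch omits attaching a training sample to the network, which is handled by the example scripts listed below:
>>> # back-propagation training with placeholder settings (illustration only)
>>> net.trainBackpropagation(1, 1000, 0.7, 0.3, 0.01)
>>> print net.getEpochError()   # epoch error, without opening the pop-up window
>>> net.showNetwork()           # inspect the adjusted weights and thresholds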
There are several examples of how to use the NN. Here is a complete set of examples:
NN_data.py - prepare data sample
NN_scaled.py - rescale the sample
NN_train.py - train using back-propagation
NN_forecast.py - make predictions
NN_analyse.py - make a detailed analysis of the predictions
jHepWork contains the Joone package [18], which can be used to create, train and test artificial neural networks. jHepWork uses this package via Jython scripts.
A complete example of how to train and verify a NN is located in the directory neural_net: click on the file NN_train.py and run it using the jHepWork editor. You will see a new tab with the log file, which contains the result of the NN training. This script reads the file wine_train.txt and trains the NN using the first 150 rows of data in this file. The first column represents the output, while the other columns are the input data. Note that the input data are normalized by the script. The NN is then tested using 28 further rows, and the result of this test is again written to the log file.
The NN_train.py script writes the NN to the serialized file nn_wine.snet. To perform a forecast using the data set in the file wine_forecast.txt, run the script NN_forecast.py (in this file, the output numbers are all set to zero).
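Since the scripts normalize the input columns before training, here is a minimal Jython sketch of one common choice, a min-max rescaling to the range [0,1]. It assumes, for illustration only, a whitespace-separated file whose first column is the output; it is not the actual code of NN_train.py or NN_scaled.py, just an indication of the kind of preprocessing involved.

# Illustrative sketch only: min-max rescaling of the input columns to [0,1].
# Assumes a whitespace-separated file where column 0 is the output value.
rows = []
for line in open("wine_train.txt"):
    line = line.strip()
    if line == "":
        continue
    rows.append([float(x) for x in line.split()])

ncol = len(rows[0])
for j in range(1, ncol):              # skip column 0, the output
    col = [r[j] for r in rows]
    lo = min(col)
    hi = max(col)
    for r in rows:
        if hi > lo:
            r[j] = (r[j] - lo) / (hi - lo)
        else:
            r[j] = 0.0

print rows[0]                         # first row after rescaling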
This example is rather similar to the one located in the file Validation_using_stream.java in the directory samples/engine/helpers of the Joone package. There are many explanations in the jHepWork NN scripts, but you should still read the API documentation of the Joone package (http://www.jooneworld.com/), and especially its manual. The NN scripts were built using the package "org.joone.samples.engine.helpers".
The Bayesian network is included in jHepWork via the JavaBayes package. It calculates marginal probabilities and expectations, produces explanations and performs robustness analysis. In addition, it allows the user to import, create, modify and export networks.
The network editor can be called using the HBayes class:
>>> from jhplot import *
>>> HBayes()
This will pop up the Bayesian-network editor and a console window. The user can follow the step-by-step instructions given in the console. The package is rather well documented, so we will not discuss it further here.