PyBrain is often considered one of the best Python libraries for neural networks. However, when it comes to machine learning in Python, Scikit-Learn is widely recognized as the top choice. The issue is that Scikit-Learn doesn't include a built-in neural network module, which is why I didn't go with it.
I've also come across references to another library called Neurolab and plan to give it a try later; from what I understand, the network types it supports differ from PyBrain's.
The documentation for PyBrain is well-structured and easy to follow, but the example code it provides doesn't fit my needs: the official examples are mostly for classification tasks, not the regression and data-fitting problem I'm aiming at.
Additionally, some functions in the documentation lack detailed explanations. In many cases, you have to use the Python shell's help function or look directly into the source code to fully understand how certain methods work.
Well, let's get back to the main task. The process will likely be divided into the following steps: constructing a neural network, building a dataset, training the network, visualizing results, and finally verifying and analyzing the performance.
Building the Neural Network
I can either use a pre-built model or construct the network manually. For this case, I’ll build a feedforward neural network from scratch.
from pybrain.structure import FeedForwardNetwork, LinearLayer, SigmoidLayer, FullConnection
# Creating the neural network
fnn = FeedForwardNetwork()
# Setting up three layers: input (3 neurons), hidden (7 neurons), output (1 neuron)
inLayer = LinearLayer(3)
hiddenLayer = SigmoidLayer(7)
outLayer = LinearLayer(1)
# Adding the layers to the network
fnn.addInputModule(inLayer)
fnn.addModule(hiddenLayer)
fnn.addOutputModule(outLayer)
# Connecting the layers
in_to_hidden = FullConnection(inLayer, hiddenLayer)
hidden_to_out = FullConnection(hiddenLayer, outLayer)
# Adding the connections to the network
fnn.addConnection(in_to_hidden)
fnn.addConnection(hidden_to_out)
# Finalizing the network structure
fnn.sortModules()
Creating the Dataset
For the dataset, I used the SupervisedDataSet class. You can also experiment with other options if needed.
from pybrain.datasets import SupervisedDataSet
# Defining the dataset format: 3 inputs, 1 output
DS = SupervisedDataSet(3, 1)
# Adding sample points to the dataset
# Assuming x1, x2, x3 are the input vectors, and y is the output vector
for i in range(len(y)):
    DS.addSample([x1[i], x2[i], x3[i]], [y[i]])
# To access the input and output values
X = DS['input']
Y = DS['target']
# Splitting the dataset into training and test sets (80% training, 20% testing)
dataTrain, dataTest = DS.splitWithProportion(0.8)
xTrain, yTrain = dataTrain['input'], dataTrain['target']
xTest, yTest = dataTest['input'], dataTest['target']
The dataset section is now complete.
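The loop above assumes the arrays x1, x2, x3, and y already exist. For readers who want something runnable, here is a hedged sketch that generates synthetic inputs; the target function y = x1 + 2*x2 - x3 plus noise is an arbitrary choice of mine, not from the original task.

```python
import numpy as np

# Synthetic stand-in data for x1, x2, x3, and y.
# The functional form of y below is a made-up example.
rng = np.random.RandomState(0)
n = 200
x1 = rng.uniform(-1, 1, n)
x2 = rng.uniform(-1, 1, n)
x3 = rng.uniform(-1, 1, n)
y = x1 + 2 * x2 - x3 + rng.normal(0, 0.05, n)
```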
Training the Neural Network
As the Pareto principle goes, 80% of the results come from 20% of the effort. Here, that critical 20% is the following code.
from pybrain.supervised.trainers import BackpropTrainer
# Using the backpropagation trainer
# verbose=True shows the total error during training
trainer = BackpropTrainer(fnn, dataTrain, verbose=True, learningrate=0.01)
# Training until convergence, with a maximum of 1000 epochs
trainer.trainUntilConvergence(maxEpochs=1000)
Visualizing the Results
Data visualization is usually handled using Pylab. You can refer to this blog post for more details on Python plotting functions.
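For a regression fit, a quick sanity check is a scatter plot of predicted versus true values. This is a matplotlib sketch with stand-in arrays; in the real script, true_vals would be yTest and pred_vals would be the network's outputs on xTest.

```python
import matplotlib
matplotlib.use("Agg")  # headless backend; drop this line for interactive use
import matplotlib.pyplot as plt
import numpy as np

# Stand-in data: replace with yTest and the network's predictions.
true_vals = np.linspace(0, 1, 50)
pred_vals = true_vals + np.random.RandomState(0).normal(0, 0.05, 50)

plt.figure()
plt.scatter(true_vals, pred_vals, label="prediction")
plt.plot([0, 1], [0, 1], "r--", label="ideal")  # y = x reference line
plt.xlabel("true value")
plt.ylabel("predicted value")
plt.legend()
plt.savefig("fit_quality.png")
```

Points hugging the diagonal indicate a good fit; systematic deviation from it reveals bias in the model.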
Verification and Analysis
To check the results, we can randomly select a test sample.
import random
# Selecting a random index from the test set
# (randint is inclusive at both ends, hence the -1)
c = random.randint(0, xTest.shape[0] - 1)
# Getting a random test sample
X2 = xTest[c, :]
# Activating the network to get a prediction
prediction = fnn.activate(X2)
# Printing the true value, predicted value, and relative error
print('True value:', yTest[c],
      'Prediction:', prediction,
      'Relative error:', (prediction - yTest[c]) / yTest[c])
You can print out the weights of the network using the following code, which was found on Stack Overflow.
for mod in fnn.modules:
    print("Module:", mod.name)
    if mod.paramdim > 0:
        print("--parameters:", mod.params)
    for conn in fnn.connections[mod]:
        print("- connection to", conn.outmod.name)
        if conn.paramdim > 0:
            print("- parameters", conn.params)

if hasattr(fnn, "recurrentConns"):
    print("Recurrent connections")
    for conn in fnn.recurrentConns:
        print("-", conn.inmod.name, "to", conn.outmod.name)
        if conn.paramdim > 0:
            print("- parameters", conn.params)
To measure the program’s performance, you can use a timer.
import time
# Start the timer (time.clock() was removed in Python 3.8; time.time() is portable)
start = time.time()
# ... the code to be timed goes here ...
# End the timer
elapsed = time.time() - start
print("Time used:", elapsed)
If you need statistical analysis, you can implement your own functions or use existing tools within the package, such as mean squared error (MSE).
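As one concrete option, MSE is short enough to write by hand with NumPy. A minimal sketch:

```python
import numpy as np

def mse(targets, predictions):
    """Mean squared error between two equal-length sequences."""
    targets = np.asarray(targets, dtype=float)
    predictions = np.asarray(predictions, dtype=float)
    return float(np.mean((targets - predictions) ** 2))

# Squared differences are (0, 0, 1), so the mean is 1/3.
print(mse([1, 2, 3], [1, 2, 4]))  # prints 0.3333333333333333
```

Applied to the test split above, mse(yTest, predictions) gives a single number to compare across network configurations.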