Main steps for neural network data fitting with the Pybrain library

Pybrain is often described as one of the best Python libraries for neural networks. In reality, Scikit-Learn is widely regarded as the top general-purpose machine learning library for Python, but it doesn’t include a neural network module, so it isn’t an option here.

I’ve also come across references to a library called Neurolab and plan to try it later, though the network types it supports seem different from what I need.

The Pybrain documentation is well written, but the official examples focus on classification tasks, while I’m more interested in regression, i.e. data-fitting problems.

Additionally, the documentation for individual functions isn’t always complete. Sometimes, you have to rely on the Python shell with the help() function or even read the source code directly to understand how things work.

Well, let's get back to the main task. The process will probably involve the following steps:

  • Constructing a neural network
  • Creating a dataset
  • Training the neural network
  • Visualizing the results
  • Validating and analyzing the performance

Building the Neural Network

I chose to build the neural network manually rather than using a pre-built model. For this example, I'm setting up a simple feedforward network. Here’s how I did it:

from pybrain.structure import FeedForwardNetwork, LinearLayer, SigmoidLayer, FullConnection

# Create the neural network
fnn = FeedForwardNetwork()

# Set up the layers: input (3 neurons), hidden (7 neurons), output (1 neuron)
inLayer = LinearLayer(3)
hiddenLayer = SigmoidLayer(7)
outLayer = LinearLayer(1)

# Add the layers to the network
fnn.addInputModule(inLayer)
fnn.addModule(hiddenLayer)
fnn.addOutputModule(outLayer)

# Create full connections between consecutive layers
in_to_hidden = FullConnection(inLayer, hiddenLayer)
hidden_to_out = FullConnection(hiddenLayer, outLayer)

# Add the connections to the network
fnn.addConnection(in_to_hidden)
fnn.addConnection(hidden_to_out)

# Final step: sort the modules into a consistent order so the network can run
fnn.sortModules()

Creating the Dataset

For the dataset, I used SupervisedDataSet, which is suitable for supervised learning tasks. You can also experiment with other dataset types if needed.

from pybrain.datasets import SupervisedDataSet

# Define the dataset structure: 3 inputs, 1 output
DS = SupervisedDataSet(3, 1)

# Add sample data points.
# Assuming x1, x2, x3 are input vectors and y is the target vector, all of the same length:
for i in range(len(y)):
    DS.addSample([x1[i], x2[i], x3[i]], [y[i]])

# Access the input and target arrays
X = DS['input']
Y = DS['target']

# Split the dataset into training and test sets (80% training, 20% testing)
dataTrain, dataTest = DS.splitWithProportion(0.8)
xTrain, yTrain = dataTrain['input'], dataTrain['target']
xTest, yTest = dataTest['input'], dataTest['target']

That concludes the dataset creation part.

Training the Neural Network

As the saying goes, 20% of the code does 80% of the work, and here the training code is that critical 20%. When relying on someone else's library it's hard to know exactly what's happening under the hood, but it still gets things done quickly.

from pybrain.supervised.trainers import BackpropTrainer

# Use the backpropagation trainer
# verbose=True prints the total error during training
trainer = BackpropTrainer(fnn, dataTrain, verbose=True, learningrate=0.01)

# Train until convergence; by default trainUntilConvergence holds out
# part of the training data as a validation set at a 4:1 split
trainer.trainUntilConvergence(maxEpochs=1000)

Visualizing the Results

Data visualization is usually handled with Pylab or Matplotlib. You can find many examples online about how to plot results in Python.

Validation and Analysis

After training, I like to test the model with a random sample from the test set to see how it performs:

import random

# Randomly select an index from the test set (randrange excludes the upper bound)
c = random.randrange(xTest.shape[0])

# Get the corresponding input and run it through the network
X2 = xTest[c, :]
prediction = fnn.activate(X2)

# Print the true value, the prediction, and the relative error
print('True value is: ' + str(yTest[c]),
      'Prediction is: ' + str(prediction),
      'Relative error: ' + str((prediction - yTest[c]) / yTest[c]))

You can also print out the weights of the network to analyze its internal structure. This code was found on Stack Overflow and is very helpful for debugging:

for mod in fnn.modules:
    print("Module:", mod.name)
    if mod.paramdim > 0:
        print("--Parameters:", mod.params)
    for conn in fnn.connections[mod]:
        print("- Connection to", conn.outmod.name)
        if conn.paramdim > 0:
            print("- Parameters", conn.params)

if hasattr(fnn, "recurrentConns"):
    print("Recurrent connections")
    for conn in fnn.recurrentConns:
        print("-", conn.inmod.name, "to", conn.outmod.name)
        if conn.paramdim > 0:
            print("- Parameters", conn.params)

To measure the program’s performance, I often use the time module to track how long certain parts take (note that time.clock() was removed in Python 3.8, so time.perf_counter() is the timer to use):

import time

# Start the timer before the code you want to measure
start = time.perf_counter()

# ... code to measure ...

# Stop the timer after the code
elapsed = time.perf_counter() - start
print("Time used: " + str(elapsed))

If you need statistical analysis, you can either write custom functions or use built-in tools in the package, such as mean squared error (MSE) calculations.
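A hand-rolled mean squared error is only a few lines; the sketch below assumes yTest holds the true targets and the predictions come from calling fnn.activate on each test row, but it works on any pair of equal-length sequences.

```python
# Sketch of a hand-rolled mean squared error for evaluating the fit.
def mse(y_true, y_pred):
    """Mean squared error of two equal-length numeric sequences."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Small worked example with made-up numbers:
# squared errors are 0.0, 0.25, 0.25, so the mean is 0.5 / 3
print(mse([1.0, 2.0, 3.0], [1.0, 2.5, 2.5]))  # -> 0.1666...
```

The same function applied to (yTest, predictions) gives a single number to compare different network sizes or training runs.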
