PyBrain – Working with Recurrent Networks

In this guide, we will discuss working with recurrent networks in PyBrain. A recurrent network is like a feed-forward network, with the one difference that it has to remember the data from each step: the history of every step has to be saved.

We will learn how to −

  • Create a Recurrent Network
  • Add Modules and Connections

Creating a Recurrent Network

To create a recurrent network, we will use the RecurrentNetwork class as shown below −

rn.py

from pybrain.structure import RecurrentNetwork
recurrentn = RecurrentNetwork()
print(recurrentn)

python rn.py

C:\pybrain\pybrain\src>python rn.py
RecurrentNetwork-0
Modules:
[]
Connections:
[]
Recurrent Connections:
[]

We can see a new section called Recurrent Connections for the recurrent network. Right now it is empty, because nothing has been added to the network yet.
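As a side note (this is an addition, not part of the original example), if you only need the standard layered topology, PyBrain's buildNetwork shortcut can assemble a comparable recurrent network in a single call −

from pybrain.tools.shortcuts import buildNetwork

# passing recurrent=True makes buildNetwork return a RecurrentNetwork;
# in my reading of PyBrain it also adds a hidden-to-hidden recurrent
# connection, but print the network to verify on your installation
net = buildNetwork(2, 3, 1, recurrent=True)
print(net)

In the rest of this guide, however, we build the network by hand so that each module and connection is explicit.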

Let us now create the layers ourselves, add them as modules, and create the connections.

Adding Modules and Connections

We are going to create three layers: input, hidden and output. They will be added to the network as the input module, a hidden module and the output module. Next, we will create the connections from input to hidden and from hidden to output, plus a recurrent connection from hidden to hidden.

Here is the code for the Recurrent network with modules and connections.

rn.py

from pybrain.structure import RecurrentNetwork
from pybrain.structure import LinearLayer, SigmoidLayer
from pybrain.structure import FullConnection
recurrentn = RecurrentNetwork()

# create the layers: input => 2 neurons, hidden => 3 neurons, output => 1 neuron
inputLayer = LinearLayer(2, 'rn_in')
hiddenLayer = SigmoidLayer(3, 'rn_hidden')
outputLayer = LinearLayer(1, 'rn_output')

# add the layers as modules of the recurrent network
recurrentn.addInputModule(inputLayer)
recurrentn.addModule(hiddenLayer)
recurrentn.addOutputModule(outputLayer)

# create the connections between input, hidden and output
input_to_hidden = FullConnection(inputLayer, hiddenLayer)
hidden_to_output = FullConnection(hiddenLayer, outputLayer)
hidden_to_hidden = FullConnection(hiddenLayer, hiddenLayer)

# add the connections to the network
recurrentn.addConnection(input_to_hidden)
recurrentn.addConnection(hidden_to_output)
recurrentn.addRecurrentConnection(hidden_to_hidden)

# sort the modules to make the network ready for activation
recurrentn.sortModules()

print(recurrentn)

python rn.py

C:\pybrain\pybrain\src>python rn.py
RecurrentNetwork-6
Modules:
[<LinearLayer 'rn_in'>, <SigmoidLayer 'rn_hidden'>, 
   <LinearLayer 'rn_output'>]
Connections:
[<FullConnection 'FullConnection-4': 'rn_hidden' -> 'rn_output'>, 
   <FullConnection 'FullConnection-5': 'rn_in' -> 'rn_hidden'>]
Recurrent Connections:
[<FullConnection 'FullConnection-3': 'rn_hidden' -> 'rn_hidden'>]

In the above output we can see the Modules, Connections and Recurrent Connections.
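As a quick check (an addition to the original tutorial), once sortModules() has been called, the weights of the connections are collected into the network's params array. A minimal sketch, continuing from the code above −

# flattened weight vector of the network; the weights are initialized
# randomly, so the values will differ on every run
print(recurrentn.params)

# 2*3 (input->hidden) + 3*1 (hidden->output) + 3*3 (recurrent)
# should give 18 weights, assuming the recurrent connection's
# parameters are counted as well
print(len(recurrentn.params))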

Let us now activate the network using the activate() method as shown below −

rn.py

Add the code below to the file created earlier −

# activate the network twice with the same input using the activate() method
act1 = recurrentn.activate((2, 2))
print(act1)

act2 = recurrentn.activate((2, 2))
print(act2)

python rn.py

C:\pybrain\pybrain\src>python rn.py
[-1.24317586]
[-0.54117783]
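
Notice that we activated the network twice with the same input (2, 2) and got two different outputs. That is the effect of the recurrent connection: the hidden layer's activation from the first step is fed back into the second step, so the network keeps a history between activations. (The exact numbers will differ on every run, because the connection weights are initialized randomly.)

To discard that history and start from a clean state, the network's reset() method can be used. A minimal sketch, continuing from the code above −

# clear the network's internal buffers, i.e. the stored history
recurrentn.reset()

# after a reset, activating with the same input gives the same
# result as the very first activation after sortModules()
act3 = recurrentn.activate((2, 2))
print(act3)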
