Bruno Penha

Reputation: 41

Elman Network in PyBrain

I'm trying to build an Elman network (also known as a Simple Recurrent Network) with PyBrain. I think the code should look something like this:

from pybrain.structure import (RecurrentNetwork, LinearLayer, TanhLayer,
                               FullConnection, IdentityConnection)

n = RecurrentNetwork()
n.addInputModule(LinearLayer(5, name = 'in'))
n.addModule(TanhLayer(10, name = 'hidden'))
n.addModule(LinearLayer(10, name = 'context'))
n.addOutputModule(LinearLayer(5, name = 'out'))
n.addConnection(FullConnection(n['in'], n['hidden'], name = 'in_to_hidden'))
n.addConnection(FullConnection(n['hidden'], n['out'], name = 'hidden_to_out'))
# copy the hidden activations into the context layer ...
n.addConnection(IdentityConnection(n['hidden'], n['context'], name = 'hidden_to_context'))
# ... and feed the context back into the hidden layer (this creates the loop)
n.addConnection(IdentityConnection(n['context'], n['hidden'], name = 'context_to_hidden'))

My problem is that I don't know how to make the context nodes (at time t) keep the values of the hidden nodes from the previous iteration (at time t-1) so they can pass them to the hidden nodes in the current iteration, nor how to fix the weights of hidden_to_context at 1. As it stands, I get an error saying there is a "loop" in the net (and indeed there is one). Any help would be much appreciated. Thank you very much.

Cheers,

Bruno

Upvotes: 1

Views: 1141

Answers (1)

rossdavidh

Reputation: 1996

I would look at this section of the PyBrain tutorial:

http://pybrain.org/docs/tutorial/netmodcon.html#using-recurrent-networks

In particular,

The RecurrentNetwork class has one additional method, .addRecurrentConnection(), which looks back in time one timestep.
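A minimal sketch of that approach (my own adaptation, not code from the tutorial): the explicit context layer can be dropped, and the hidden layer's previous activation is fed back through a recurrent connection, which PyBrain delays by one timestep. Using an IdentityConnection keeps the feedback weights fixed at 1, as in a classic Elman network; if you want trainable feedback weights, a FullConnection from 'hidden' to 'hidden' would go in its place.

from pybrain.structure import (RecurrentNetwork, LinearLayer, TanhLayer,
                               FullConnection, IdentityConnection)

n = RecurrentNetwork()
n.addInputModule(LinearLayer(5, name = 'in'))
n.addModule(TanhLayer(10, name = 'hidden'))
n.addOutputModule(LinearLayer(5, name = 'out'))
n.addConnection(FullConnection(n['in'], n['hidden'], name = 'in_to_hidden'))
n.addConnection(FullConnection(n['hidden'], n['out'], name = 'hidden_to_out'))
# Recurrent connections are applied with a one-timestep delay, so the hidden
# layer sees its own activation from t-1; IdentityConnection keeps those
# feedback weights fixed at 1, playing the role of the explicit context layer.
n.addRecurrentConnection(IdentityConnection(n['hidden'], n['hidden'],
                                            name = 'context'))
n.sortModules()

After sortModules() the network can be activated one input at a time, and calling n.reset() should clear the remembered hidden state between sequences.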

Upvotes: 2
