Siemkowski

Reputation: 1431

Calculating neural network with arbitrary topology

Reading about neuroevolution in a PowerPoint presentation, I came across this phrase:

network output is calculated the standard way

I successfully implemented a simple feedforward mechanism following some guides (using a vector representation of the weights - 1, 2, 3), and I understand (more or less) how recurrent networks could be calculated.

What I couldn't find is how a neural network with arbitrary topology would be calculated. Is there a 'standard way' (algorithm)?

I imagine one way (assuming a feedforward topology), though very time consuming, would be to loop through all neurons repeatedly, evaluating every neuron whose inputs are already known, until the output is calculated - something like the sketch below.
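Here is a rough sketch of what I mean (the representation is hypothetical - just a dict mapping each non-input node to its incoming (source, weight) links, not any particular encoding):

    import math

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def evaluate_naive(nodes, inputs):
        """nodes: {node_id: [(source_id, weight), ...]} for non-input nodes.
        inputs: {node_id: value} for input (and bias) nodes."""
        values = dict(inputs)
        pending = set(nodes)
        while pending:
            progressed = False
            for node_id in list(pending):
                links = nodes[node_id]
                # A node can be evaluated once all of its sources are known.
                if all(src in values for src, _ in links):
                    total = sum(values[src] * w for src, w in links)
                    values[node_id] = sigmoid(total)
                    pending.remove(node_id)
                    progressed = True
            if not progressed:
                raise ValueError("stuck - topology is not feedforward")
        return values

    # Toy network: bias (0), two inputs (1, 2), hidden (3), output (4)
    nodes = {3: [(0, -0.5), (1, 1.0), (2, 1.0)],
             4: [(0, -0.2), (3, 1.5)]}
    print(evaluate_naive(nodes, {0: 1.0, 1: 0.0, 2: 1.0})[4])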

I imagine another method could be to organize the arbitrary topology into layers (also assuming a feedforward topology - this?) and then calculate it layer by layer.

QUESTIONS

What is the 'standard way' (algorithm) to calculate the output of a network with arbitrary topology?

ASSUMPTIONS

  1. A feedforward topology (recurrent topology as a bonus, probably much more complicated).
  2. Bias node present.

PS. I'm working with Python, following the NEAT paper.

Upvotes: 3

Views: 835

Answers (1)

Maxim

Reputation: 53758

Neural networks can't have a truly arbitrary topology; there are certain restrictions:

  • The topology must be (reducible to) a directed acyclic graph (DAG). You might ask whether RNNs contradict this requirement: they don't, because every RNN can be unrolled into a DAG. There are other cases where cycles can be emulated in the network, but however the network is executed, it can always be presented as a DAG, and backpropagation through it is limited accordingly.
  • The graph must have dedicated input and output nodes, such that no input depends on an output. Input nodes typically provide the training data.
  • There are also other restrictions, e.g. activation functions must be differentiable so that backpropagation works.

Now you can see that these networks are very much like feed-forward ones: the forward pass runs from the inputs to the outputs, and the backward pass runs in the opposite direction. This is possible because a DAG can be sorted topologically. In fact, a topological sort is simply the representation of the graph in a feed-forward manner.
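A minimal sketch of that, assuming the same kind of toy representation as in the question (a dict mapping each non-input node to its incoming (source, weight) links - an illustration, not NEAT's genome encoding): topologically sort the nodes with Kahn's algorithm, then evaluate them in a single pass.

    import math
    from collections import deque

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def topological_order(nodes, input_ids):
        """Kahn's algorithm: order the node ids so that every node
        appears after all of its sources; fails on cycles."""
        indegree = {n: len(links) for n, links in nodes.items()}
        successors = {}
        for node_id, links in nodes.items():
            for src, _ in links:
                successors.setdefault(src, []).append(node_id)
        ready = deque(input_ids)  # input/bias nodes have no incoming links
        order = []
        while ready:
            n = ready.popleft()
            order.append(n)
            for succ in successors.get(n, ()):
                indegree[succ] -= 1
                if indegree[succ] == 0:
                    ready.append(succ)
        if len(order) != len(nodes) + len(input_ids):
            raise ValueError("graph contains a cycle")
        return order

    def forward_pass(nodes, inputs):
        """Single sweep over the topologically sorted nodes."""
        values = dict(inputs)
        for node_id in topological_order(nodes, list(inputs)):
            if node_id in nodes:  # skip input/bias nodes
                total = sum(values[src] * w for src, w in nodes[node_id])
                values[node_id] = sigmoid(total)
        return values

Unlike the repeated-sweep approach, this visits every node exactly once, so the cost is linear in the number of connections.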

As for cycle emulation, the number of iterations is always bounded, because memory is limited. The unrolled network is effectively a DAG, with inputs, outputs and some repeated pattern in between, which can again be seen as feed-forward.
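For illustration, here is a hypothetical single node with a self-loop, unrolled over a fixed number of time steps; the unrolled computation is a plain chain, i.e. a DAG:

    import math

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def unrolled_self_loop(xs, w_in, w_rec):
        """h_t = sigmoid(w_in * x_t + w_rec * h_{t-1}), with h_0 = 0.
        Unrolling over len(xs) steps yields a feed-forward chain."""
        h = 0.0
        for x in xs:
            h = sigmoid(w_in * x + w_rec * h)
        return h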

To summarize: the same mechanism that is used for simple feed-forward networks works for all networks (the 'standard way', if you like), but the network representation can sometimes look different.

Upvotes: 3
