Reading about neuroevolution in a PowerPoint presentation, I came across the phrase:
network output is calculated the standard way
I have successfully implemented a simple feedforward mechanism by following some guides (using a vector representation of weights - 1, 2, 3), and I understand, more or less, how recurrent networks could be calculated.
What I couldn't find is how a neural network with an arbitrary topology would be calculated. Is there a 'standard way' (algorithm)?
One way I can imagine (assuming a feedforward topology), though very time-consuming, would be to loop through all neurons repeatedly until the output is calculated.
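To make the idea concrete, here is a minimal sketch of that repeated-sweep approach. The representation is a hypothetical one (a dict mapping each neuron id to its incoming `(source, weight)` connections, with a sigmoid activation); it is not taken from any particular library:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def evaluate_by_sweeping(inputs, connections, output_ids):
    """Repeatedly sweep over all neurons, activating any neuron whose
    inputs are all already known, until every output has a value."""
    values = dict(inputs)  # neuron id -> activation
    while not all(o in values for o in output_ids):
        progressed = False
        for neuron, incoming in connections.items():
            if neuron in values:
                continue
            if all(src in values for src, _ in incoming):
                total = sum(values[src] * w for src, w in incoming)
                values[neuron] = sigmoid(total)
                progressed = True
        if not progressed:
            raise ValueError("network has a cycle or a disconnected output")
    return [values[o] for o in output_ids]

# Hypothetical network: neurons 0 and 1 are inputs, 2 and 3 are hidden,
# 4 is the output; weights are arbitrary example values.
connections = {
    2: [(0, 0.5), (1, -0.3)],
    3: [(2, 1.2), (0, 0.7)],
    4: [(2, -0.6), (3, 0.9)],
}
print(evaluate_by_sweeping({0: 1.0, 1: 0.5}, connections, [4]))
```

In the worst case this does a full sweep per neuron, hence the time-consuming part, but it needs no preprocessing of the topology.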
Another method I can imagine could be organizing the arbitrary topology into layers (also assuming a feedforward topology - this?) and then calculating it layer by layer.
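The layering idea can be sketched as follows: put the inputs in layer 0 and place every other neuron one layer after its deepest predecessor. This assumes the same hypothetical `(source, weight)` connection dict as above and that the graph is acyclic:

```python
def assign_layers(connections, input_ids):
    """Assign each neuron to a layer: inputs are layer 0, every other
    neuron sits one layer after its deepest predecessor (DAG assumed)."""
    layer = {i: 0 for i in input_ids}

    def depth(n):
        if n not in layer:
            layer[n] = 1 + max(depth(src) for src, _ in connections[n])
        return layer[n]

    for n in connections:
        depth(n)
    # group neuron ids by layer index
    groups = {}
    for n, l in layer.items():
        groups.setdefault(l, []).append(n)
    return [groups[l] for l in sorted(groups)]

# Hypothetical network: inputs 0 and 1, hidden 2 and 3, output 4.
connections = {
    2: [(0, 0.5), (1, -0.3)],
    3: [(2, 1.2), (0, 0.7)],
    4: [(2, -0.6), (3, 0.9)],
}
print(assign_layers(connections, [0, 1]))  # → [[0, 1], [2], [3], [4]]
```

Once the layers are known, the network can be evaluated one layer at a time, just like a conventional multilayer network.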
QUESTIONS
What is the 'standard way' to calculate arbitrary topology network output? / How to calculate arbitrary topology network output?
ASSUMPTIONS
PS. I'm working with Python, following the NEAT paper.
Upvotes: 3
Neural networks can't have truly arbitrary topology; there are certain restrictions: the connection graph must effectively be a directed acyclic graph (DAG), and cycles (recurrent connections) can only be emulated by iterating for a limited number of steps.
Now you can notice that such networks are very much like feed-forward ones: the forward pass runs from the inputs to the outputs, and the backward pass runs in the other direction. This is possible because a DAG can be sorted topologically; in fact, a topological sort is simply a representation of the graph in a feed-forward manner.
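A minimal sketch of that idea: first topologically sort the neurons (Kahn's algorithm here), then evaluate them in that order, so each neuron only ever reads already-computed activations. The connection-dict representation and sigmoid activation are assumptions for illustration:

```python
import math
from collections import deque

def topological_order(connections, input_ids):
    """Kahn's algorithm: return neuron ids so that every neuron comes
    after all of its sources. Raises if the graph contains a cycle."""
    indegree = {n: len(incoming) for n, incoming in connections.items()}
    successors = {}
    for n, incoming in connections.items():
        for src, _ in incoming:
            successors.setdefault(src, []).append(n)
    ready = deque(input_ids)
    order = []
    while ready:
        n = ready.popleft()
        order.append(n)
        for succ in successors.get(n, []):
            indegree[succ] -= 1
            if indegree[succ] == 0:
                ready.append(succ)
    if len(order) != len(connections) + len(input_ids):
        raise ValueError("graph contains a cycle")
    return order

def forward(inputs, connections, output_ids):
    """Evaluate neurons in topological order, feed-forward style."""
    values = dict(inputs)
    for n in topological_order(connections, list(inputs)):
        if n in values:
            continue
        total = sum(values[src] * w for src, w in connections[n])
        values[n] = 1.0 / (1.0 + math.exp(-total))  # sigmoid activation
    return [values[o] for o in output_ids]

# Hypothetical network: inputs 0 and 1, hidden 2 and 3, output 4.
connections = {
    2: [(0, 0.5), (1, -0.3)],
    3: [(2, 1.2), (0, 0.7)],
    4: [(2, -0.6), (3, 0.9)],
}
print(forward({0: 1.0, 1: 0.5}, connections, [4]))
```

The sort is computed once per topology, so repeated evaluations cost a single pass over the neurons.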
As for cycle emulation, the number of iterations is always limited, because memory is limited. The unrolled network is effectively a DAG, with inputs, outputs and some repeated pattern in between, which can also be treated as feed-forward.
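A minimal sketch of emulating a cycle by unrolling it for a fixed number of steps: at each step every neuron is recomputed from the previous step's activations, so the unrolled computation is a DAG. The representation is the same hypothetical connection dict as before:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def evaluate_recurrent(inputs, connections, output_ids, steps):
    """Emulate cycles by iterating a fixed number of steps; each step
    reads only the previous step's activations (initially all 0.0)."""
    values = {n: 0.0 for n in connections}
    values.update(inputs)
    for _ in range(steps):
        new_values = dict(inputs)  # inputs stay clamped each step
        for n, incoming in connections.items():
            total = sum(values.get(src, 0.0) * w for src, w in incoming)
            new_values[n] = sigmoid(total)
        values = new_values
    return [values[o] for o in output_ids]

# Hypothetical network with a cycle between neurons 2 and 3:
# input 0 feeds 2, 2 feeds 3, and 3 feeds back into 2.
connections = {
    2: [(0, 0.5), (3, -0.4)],
    3: [(2, 1.1)],
}
print(evaluate_recurrent({0: 1.0}, connections, [3], steps=5))
```

The choice of `steps` bounds both the memory and the "depth" of the unrolled DAG.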
To summarize: the same mechanism that is used for simple feed-forward networks works for all networks (that is the 'standard way', if you like), but the network representation can sometimes look different.
Upvotes: 3