Reputation: 4357
Below is my code for a neural network, with 3 inputs, 1 hidden layer, and 1 output:
# Imports (PyBrain)
from pybrain.datasets import SupervisedDataSet
from pybrain.structure import FeedForwardNetwork, LinearLayer, SigmoidLayer, FullConnection
from pybrain.supervised.trainers import BackpropTrainer
import csv

# Data
ds = SupervisedDataSet(3, 1)
with open('my_file.csv', 'r') as myfile:
    for data in csv.reader(myfile):
        indata = tuple(float(x) for x in data[:3])
        outdata = (float(data[3]),)
        ds.addSample(indata, outdata)
net = FeedForwardNetwork()
inp = LinearLayer(3)
h1 = SigmoidLayer(1)
outp = LinearLayer(1)
# add modules
net.addOutputModule(outp)
net.addInputModule(inp)
net.addModule(h1)
# create connections
net.addConnection(FullConnection(inp, h1))
net.addConnection(FullConnection(h1, outp))
# finish up
net.sortModules()
# initialize the backprop trainer and train
trainer = BackpropTrainer(net, ds)
trainer.trainOnDataset(ds, 1000)
trainer.testOnData(verbose=True)
print 'Final weights:', net.params
My question is: if you want to use this trained neural network to make a forecast based on specific inputs, how do you do it?
Upvotes: 2
Views: 3817
Reputation: 3497
If I understand you correctly, your data has a time order. What I do for making a forecast is to shift the data table so that the next output is presented as the target for training. For example, if you have this kind of data:
w1 x1 y1 z1
w2 x2 y2 z2
w3 x3 y3 z3
w4 x4 y4 z4
. . .
and you want to predict the next z from the current row, you construct a table like:
w1 x1 y1 z1 | z2
w2 x2 y2 z2 | z3
w3 x3 y3 z3 | z4
. . .
Then you present the last column as the target for training. Of course, you lose one line at the end of your table.
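For example (a minimal sketch, assuming the rows are already parsed into a hypothetical rows list of [w, x, y, z] floats in time order), the shifted table can be fed into PyBrain like this:
from pybrain.datasets import SupervisedDataSet

# rows is a hypothetical, already-parsed list of [w, x, y, z] floats in time order
rows = [[0.1, 0.2, 0.3, 0.4],
        [0.2, 0.3, 0.4, 0.5],
        [0.3, 0.4, 0.5, 0.6]]

ds = SupervisedDataSet(4, 1)                       # inputs: w, x, y, z
for current, following in zip(rows, rows[1:]):     # pairs each row with the next one
    ds.addSample(tuple(current), (following[3],))  # target = next z; the last row is dropped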
You can also improve the output by giving the difference between steps as an additional input (this gives you the dynamical effect):
w2 x2 y2 z2 (w2-w1) (z2-z1) | z3
w3 x3 y3 z3 (w3-w2) (z3-z2) | z4
. . .
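A minimal sketch of the same idea with the difference columns added (again using the hypothetical rows list above); each sample now also needs the previous row, so you lose one line at the start as well:
ds = SupervisedDataSet(6, 1)                       # inputs: w, x, y, z, (w - w_prev), (z - z_prev)
for prev, current, following in zip(rows, rows[1:], rows[2:]):
    dw = current[0] - prev[0]
    dz = current[3] - prev[3]
    ds.addSample(tuple(current) + (dw, dz), (following[3],))  # target = next z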
Upvotes: 0
Reputation: 1690
According to the documentation, you can test specific inputs with the activate method on your network. Assuming your input looks something like (1, 2, 3), your code would look like:
net.activate((1,2,3))
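For instance, with the trained network from the question (the input values here are just placeholders), a single forecast could look like this:
# activate() returns an array with one entry per output neuron;
# this network has a single output, so the forecast is the first entry.
forecast = net.activate((0.5, 1.2, 3.4))
print 'Forecast:', forecast[0]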
Upvotes: 4