Reputation: 473
I am relatively new to TensorFlow and want to use the DNNRegressor from tf.contrib.learn for a regression task. But instead of one output node, I would like to have several (say, ten).
How can I configure my regressor to have multiple output nodes?
My question is related to the following ones already asked on SO, but there seems to be no working answer (I am using TensorFlow version 0.11):
skflow regression predict multiple values
Multiple target columns with SkFlow TensorFlowDNNRegressor
Upvotes: 8
Views: 2130
Reputation: 104
Using tflearn this works:
import tflearn as tfl

net = tfl.input_data(shape=[None, n_features1, n_features2], name='input')
net = tfl.fully_connected(net, 128, activation='relu')
# linear output layer: one node per target value
net = tfl.fully_connected(net, n_features, activation='linear')
net = tfl.regression(net, batch_size=batch_size, loss='mean_square', name='target')
Replace the single 128-node fully connected layer here with whatever network architecture you want, and don't forget to choose a loss function appropriate to your problem, e.g., cross-entropy for classification.
python 2.7.11, tensorflow 0.10.0rc0, tflearn 0.2.1
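To actually train it, a minimal sketch (not part of the original answer) could look like the following, assuming X and Y are NumPy arrays of shapes (n_samples, n_features1, n_features2) and (n_samples, n_features), and X_test and batch_size are defined elsewhere:

# minimal training sketch; X, Y, X_test and batch_size are assumed placeholders
model = tfl.DNN(net)
model.fit(X, Y, n_epoch=50, batch_size=batch_size)
predictions = model.predict(X_test)  # shape (len(X_test), n_features)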
Upvotes: 0
Reputation: 1920
It seems that using tflearn is the other choice.
Update: I realize we should use Keras as a well-developed API on top of TensorFlow and Theano.
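For reference, a minimal Keras sketch (Keras 1.x, contemporary with TensorFlow 0.11) of a network with ten output nodes for this kind of regression; n_features, X, and Y are assumed placeholders, not taken from the question:

from keras.models import Sequential
from keras.layers import Dense

# linear output layer with one node per target value
model = Sequential()
model.add(Dense(128, activation='relu', input_dim=n_features))
model.add(Dense(10))
model.compile(optimizer='adam', loss='mse')
model.fit(X, Y, nb_epoch=50, batch_size=32)  # Y has shape (n_samples, 10)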
Upvotes: 1