Wooni

Reputation: 501

Simple neural network in tensorflow -> shape problem

I have a list of data pairs (x1, y1), (x2, y2), ...

I do not know the equation relating x and y, so I tried to use a neural network to find it.

The hyperbolic.txt file contains these (x1, y1), (x2, y2), ... pairs, one pair per line.

The code is below, but it does not work:

ValueError: Cannot feed value of shape (30,) for Tensor 'Placeholder:0', which has shape '(?, 1)'

I guess the shapes of np_poses_x and np_poses_y might be wrong, but I cannot figure out how to change them.

import tensorflow as tf
import numpy as np
from tqdm import tqdm
import random

class datasource(object):
    def __init__(self, xx, yy):
        self.x = xx
        self.y = yy

def get_data(directory, dataset):
    xx = []
    yy = []

    with open(directory+dataset) as f:
        for line in f:
            p0,p1 = line.split()
            p0 = float(p0)
            p1 = float(p1)
            xx.append((p0))
            yy.append((p1))

    return datasource(xx, yy)


def gen_data(source):
    while True:
        indices = list(range(len(source.x)))
        random.shuffle(indices)
        for i in indices:
            xval = source.x[i]
            yval = source.y[i]
            yield xval, yval

def gen_data_batch(source, batch_size):
    data_gen = gen_data(source)
    while True:
        x_batch = []
        y_batch = []

        for _ in range(batch_size):
            _x, _y = next(data_gen)
            x_batch.append(_x)
            y_batch.append(_y)

        yield np.array(x_batch), np.array(y_batch)

X1 = tf.placeholder(tf.float32, shape=[None, 1])
Y = tf.placeholder(tf.float32, shape=[None, 1])

W1 = tf.Variable(tf.random_normal([1, 50], stddev=0.01))
L1 = tf.nn.relu(tf.matmul(X1, W1))

W2 = tf.Variable(tf.random_normal([50, 256], stddev=0.01))
L2 = tf.nn.relu(tf.matmul(L1, W2))

W3 = tf.Variable(tf.random_normal([256, 1], stddev=0.01))
model = tf.matmul(L2, W3)

cost = tf.reduce_mean(tf.square(model-Y))
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01)
train = optimizer.minimize(cost)

sess = tf.Session()
sess.run(tf.global_variables_initializer())

datasource = get_data('', 'hyperbolic.txt')

max_iterations = 100000
batch = 30
data_gen = gen_data_batch(datasource, batch)
for i in range(max_iterations):
    np_poses_x, np_poses_y = next(data_gen)
    feed = {X1: np_poses_x, model: np_poses_y}
    sess.run(optimizer, feed_dict=feed)
    np_loss = sess.run(cost, feed_dict=feed)
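
For reference, a quick check of what gen_data_batch yields (printed outside the training loop) shows the batches are flat vectors:

np_poses_x, np_poses_y = next(data_gen)
print(np_poses_x.shape)  # (30,) -- flat, while X1 expects (None, 1)
print(np_poses_y.shape)  # (30,)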

Upvotes: 2

Views: 99

Answers (1)

NiziL

Reputation: 5140

You got it right: you need to feed your network an (N, 1) tensor, not an (N,) tensor.

The easiest solution might be to add the new dimension on the NumPy side, using either np.newaxis (which is just an alias for None) or np.reshape.
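
For instance, here is a minimal sketch with a throwaway array, just to show that both options produce the same (N, 1) shape:

import numpy as np

a = np.zeros(30)                  # shape (30,)
b = a[:, np.newaxis]              # shape (30, 1)
c = np.reshape(a, (-1, 1))        # shape (30, 1), same result as b
print(a.shape, b.shape, c.shape)  # (30,) (30, 1) (30, 1)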

You can apply this in gen_data_batch, for example by replacing yield np.array(x_batch), np.array(y_batch) with yield np.array(x_batch)[:, np.newaxis], np.array(y_batch)[:, np.newaxis].
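
A minimal sketch of the modified generator (same logic as yours, only the final yield changes):

def gen_data_batch(source, batch_size):
    data_gen = gen_data(source)
    while True:
        x_batch = []
        y_batch = []

        for _ in range(batch_size):
            _x, _y = next(data_gen)
            x_batch.append(_x)
            y_batch.append(_y)

        # Add a trailing axis so each batch has shape (batch_size, 1)
        yield np.array(x_batch)[:, np.newaxis], np.array(y_batch)[:, np.newaxis]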

You can also add the new axis directly to np_poses_x and np_poses_y:

feed = {X1: np_poses_x.reshape((len(np_poses_x), 1)),
        model: np_poses_y.reshape((len(np_poses_y), 1))}
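
Either way the arrays you feed will match the (None, 1) placeholders; doing the reshape once inside gen_data_batch has the small advantage that the training loop itself does not need to change.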

Upvotes: 1
