batuman

Reputation: 7304

Why is the shape changed when feeding a TensorFlow graph for batch processing?

The graph needs to process data in batches, and the input shapes are defined as (None, 60, 80, 19) and (None, 60, 80, 38).

The TensorFlow graph definition is as follows:

def __init__(self, tf_config=None):
    # Placeholders for batched heat maps and PAF maps; the batch dimension is left as None.
    self.tensor_heatMat = tf.placeholder(
        dtype=tf.float32, shape=(None, 60, 80, 19), name='heatMat_in')
    self.tensor_pafMat = tf.placeholder(
        dtype=tf.float32, shape=(None, 60, 80, 38), name='pafMat_in')
    self.upsample_size = tf.placeholder(
        dtype=tf.int32, shape=(2,), name='upsample_size')
    # Upscale both maps to the requested (height, width).
    self.tensor_heatMat_up = tf.image.resize_area(
        self.tensor_heatMat, self.upsample_size, align_corners=False, name='upsample_heatmat')
    self.tensor_pafMat_up = tf.image.resize_area(
        self.tensor_pafMat, self.upsample_size, align_corners=False, name='upsample_pafmat')
    # Gaussian smoothing followed by non-maximum suppression via max pooling to keep peaks.
    smoother = Smoother({'data': self.tensor_heatMat_up}, 25, 3.0)
    gaussian_heatMat = smoother.get_output()
    max_pooled_in_tensor = tf.nn.pool(
        gaussian_heatMat, window_shape=(3, 3), pooling_type='MAX', padding='SAME')
    self.tensor_peaks = tf.where(
        tf.equal(gaussian_heatMat, max_pooled_in_tensor),
        gaussian_heatMat, tf.zeros_like(gaussian_heatMat))
    self.heatMat = self.pafMat = None
    self.persistent_sess = tf.InteractiveSession()
    # Initialize only the variables that are still uninitialized.
    self.persistent_sess.run(tf.variables_initializer(
        [v for v in tf.global_variables() if
         v.name.split(':')[0] in [x.decode('utf-8') for x in
                                  self.persistent_sess.run(tf.report_uninitialized_variables())]
         ]))


def inference(self, heatmat, pafmat, upsample_size=4.0):
    peaks, heatMat_up, pafMat_up = self.persistent_sess.run(
        [self.tensor_peaks, self.tensor_heatMat_up, self.tensor_pafMat_up], feed_dict={
            self.tensor_heatMat: [heatmat], self.tensor_pafMat: [pafmat], self.upsample_size: (240, 320)
        })
    peaks = peaks[0]
    self.heatMat = heatMat_up[0]
    self.pafMat = pafMat_up[0]
    humans = PoseEstimator.estimate_paf(peaks, self.heatMat, self.pafMat)
    return humans

So self.tensor_heatMat and self.tensor_pafMat require batched tensors.

My input data to those placeholders are:

outputs = outputs.reshape(32, 60, 80, 57)
heat_maps = outputs[:, :, :, :19]
puf_maps = outputs[:, :, :, 19:]
humans = inference(heat_maps, puf_maps, 4.0)

The shapes of heat_maps and puf_maps are (32, 60, 80, 19) and (32, 60, 80, 38). But when I run the session with these inputs, I get an error:

ValueError: Cannot feed value of shape (1, 32, 60, 80, 19) for Tensor 'heatMat_in:0', which has shape '(?, 60, 80, 19)'

What could be the issue?

Upvotes: 0

Views: 100

Answers (1)

Jindřich

Reputation: 11240

The error is in the feed_dict. If you put the tensors in lists, TensorFlow interprets the list as an extra leading dimension of the tensor. This is where the shape (1, 32, ...) comes from. You should instead do:

feed_dict={
    self.tensor_heatMat: heatmat,
    self.tensor_pafMat: pafmat,
    self.upsample_size: (240, 320)}

The first dimension will then be 32, which is the variable batch size that you left as None in the model's __init__ method.
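
To see where the extra dimension comes from, here is a minimal NumPy sketch; heatmat is assumed to be an array of shape (32, 60, 80, 19), as in the question:

import numpy as np

# Stand-in for the batched heat maps from the question.
heatmat = np.zeros((32, 60, 80, 19), dtype=np.float32)

# Wrapping the array in a list adds a leading dimension of size 1,
# which no longer matches the placeholder shape (?, 60, 80, 19).
print(np.asarray([heatmat]).shape)   # (1, 32, 60, 80, 19)

# Feeding the array directly keeps the batch dimension where the
# placeholder expects it.
print(np.asarray(heatmat).shape)     # (32, 60, 80, 19)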

Upvotes: 1
