AAA

Reputation: 305

Parameter values in TensorFlow

I am new to TensorFlow. I am trying to feed the SVHN dataset into the code presented in this CNN tutorial. The tutorial code reads the CIFAR-10 dataset as binary files, and I want to replace it with the SVHN dataset stored as PNG images.

I changed the layers and the data-reading steps. I also resize every input image to [32, 32] after reading it.

batch_size = 128

The problem is that when I try to train the model, it gives me an error in the input-data step.

The relevant code is shown below:

  label_bytes = 1  # 2 for CIFAR-100
  result.height = 32
  result.width = 32
  result.depth = 3
  image_bytes = result.height * result.width * result.depth
  # Every record consists of a label followed by the image, with a
  # fixed number of bytes for each.
  record_bytes = label_bytes + image_bytes

  # Read a record, getting filenames from the filename_queue.  No
  # header or footer in the CIFAR-10 format, so we leave header_bytes
  # and footer_bytes at their default of 0.



  reader = tf.WholeFileReader()

  # for binary format (CIFAR dataset)
  ###reader = tf.FixedLengthRecordReader(record_bytes=record_bytes) ## used for binary (.bin) format
  ###reader = tf.TextLineReader() ## this is for csv format; I used it for the .mat format
  result.key, value = reader.read(filename_queue)


  # Convert from a string to a vector of uint8 that is record_bytes long.
  ###record_bytes = tf.decode_raw(value, tf.uint8) ## for .bin format
  record_bytes = tf.image.decode_png(value)
  result.uint8image = record_bytes
  result.uint8image = tf.image.resize_images(result.uint8image, [32,32])

  # The first bytes represent the label, which we convert from uint8->int32.
  result.label = tf.cast(
      tf.strided_slice(record_bytes, [0], [label_bytes]), tf.int32)

  # The remaining bytes after the label represent the image, which we reshape
  # from [depth * height * width] to [depth, height, width].

  depth_major = tf.reshape(
      tf.strided_slice(record_bytes, [label_bytes],
                       [label_bytes + image_bytes]),
      [result.depth, result.height, result.width])
  # Convert from [depth, height, width] to [height, width, depth].
  result.uint8image = tf.transpose(depth_major, [1, 2, 0])

The error is shown below:

File "/home/Desktop/SVHN/cifar10_input.py", line 111, in read_cifar10 [result.depth, result.height, result.width])

InvalidArgumentError (see above for traceback): Input to reshape is a tensor with 44856 values, but the requested shape has 3072

I have two questions:

1) Can someone explain this error? I cannot understand it, and I'd like to know how to solve it.

2) Is there any good tutorial that explains how to choose good CNN parameter values?

Upvotes: 1

Views: 263

Answers (1)

iga

Reputation: 3633

  1. The cause of the error is that you cannot reshape a tensor with 44856 values into a tensor with 32*32*3 (= 3072) values. Reshaping simply changes the shape of a tensor without adding or removing any values, so the element counts must match (see the small sketch after this list). In your case, tf.strided_slice somehow produced a large tensor (with 44856 values), and that cannot be reshaped into a 32*32*3 tensor. I can't tell exactly how that happened without the complete code and an example file.
  2. This question is beyond the scope of Stack Overflow and is probably too general to have a reasonable answer.
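
For illustration only, here is a tiny sketch of that rule; the shapes are made up to mirror the numbers in your error message:

  import tensorflow as tf

  # 3*32*32 = 3072 elements, so this reshape is legal.
  ok = tf.reshape(tf.zeros([3072]), [3, 32, 32])

  # 44856 != 3072, so this reshape fails. With a statically known shape it
  # fails when the graph is built; with a dynamic shape (as with a decoded
  # PNG) it fails at run time with InvalidArgumentError.
  # bad = tf.reshape(tf.zeros([44856]), [3, 32, 32])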

Also, I noticed that you are trying to extract the label from the first byte of your record_bytes. This seems wrong, since record_bytes in your case is a decoded PNG, not a special CIFAR record that encodes the image's label in its first byte.
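
If it helps, here is a rough sketch (untested, and making assumptions about your setup) of one way to pair PNG images with labels when the label is not stored inside the file. I'm assuming a hypothetical filename scheme like "3_00042.png" where the leading digit is the label; SVHN itself does not ship PNGs named that way, so adapt this to however you exported the images:

  import tensorflow as tf

  def read_svhn_png(filename_queue):
    reader = tf.WholeFileReader()
    key, value = reader.read(filename_queue)

    # The file contents are just a PNG; there is no label byte to slice off.
    image = tf.image.decode_png(value, channels=3)
    image = tf.image.resize_images(image, [32, 32])
    image.set_shape([32, 32, 3])

    # Recover the label from the filename instead of from the record bytes.
    # (Assumes hypothetical names like ".../3_00042.png".)
    basename = tf.string_split([key], '/').values[-1]
    label_str = tf.string_split([basename], '_').values[0]
    label = tf.string_to_number(label_str, out_type=tf.int32)
    return image, label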

Upvotes: 1
