Achille

Reputation: 51

Keras / Tensorflow : "You must feed a value for placeholder tensor 'input_1' with dtype float and shape [?, 600, 451, 3]"

I have this CNN I'm working on. Input shape is dynamic, but I fixed it to [?, 600, 451, 3] (batch_size, height, width, channels) so that I can debug it.

I have a random batch generator that I created:

test = random_batch_generator(
    z_train,
    num_processes=12,
    num_batch=steps_train,
    preloaded_batch=100,
    batch_size=batch_size,
    chunk_size=batch_size,
    dataaugmfunc=heavy_dataaugm,
    seq=seq,
    initial_dim=initial_dim,
    min_overlap=MINOVERLAP,
)

When I do:

next(test)[0].shape

or

next(test)[0].dtype

it outputs the correct shape ([?, 600, 451, 3]) and dtype (float32), which is in theory what my input requires. I also checked the content of the batches, and it looks fine.

Still, when I train my model with the following:

model.fit_generator(
    random_batch_generator(z_train (...)),
    validation_data=(x_val_mem, y_val_mem),
    steps_per_epoch=steps_train,
    validation_steps=steps_val,
    epochs=epochs,
    callbacks=model_callbacks(modelname),
    class_weight=[0.005, 0.995],
)

I get this error message:

InvalidArgumentError (see above for traceback): You must feed a value for placeholder tensor 'input_1' with dtype float and shape [?,600,451,3]

[[Node: input_1 = Placeholder[dtype=DT_FLOAT, shape=[?,600,451,3], _device="/job:localhost/replica:0/task:0/device:GPU:0"]]]

What am I doing wrong? Thanks a thousand for any help or intuition on this.

Upvotes: 5

Views: 8188

Answers (4)

Timbus Calin

Reputation: 15023

This happened to me (TF 1.14) when I set histogram_freq=1 instead of 0 in the TensorBoard callback.
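For reference, a minimal sketch of that change, assuming the standard keras.callbacks.TensorBoard callback (the log directory and generator name are placeholders):

from keras.callbacks import TensorBoard

# histogram_freq=0 disables the histogram pass, which is the part that needs
# an extra feed of input data and can trigger the placeholder error.
# './logs' is just a placeholder log directory.
tensorboard_cb = TensorBoard(log_dir='./logs', histogram_freq=0)

model.fit_generator(
    train_generator,          # placeholder for your batch generator
    steps_per_epoch=steps_train,
    epochs=epochs,
    callbacks=[tensorboard_cb],
)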

Upvotes: 0

KelishaZ

Reputation: 101

  • Most of the time this problem occurs because unused inputs (in your fit generator) are fed to your network. Try to avoid, or comment out, unused inputs in your network and try again. If the number of inputs the model expects and the number of inputs produced by the batch generator or passed to fit() are not balanced, this problem will happen (see the sketch below, after the session-reset snippet).

Before all of that, you have to reset your session:

import keras.backend as K
K.clear_session()
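
To illustrate what "balanced" means here, a minimal sketch of a generator whose yielded tuple lines up with a single-input, single-output model (all names and shapes below are hypothetical, not from the question):

import numpy as np

def dummy_balanced_generator(batch_size=8):
    # A single-input / single-output model expects the generator to yield
    # exactly one input array and one target array per batch.
    # A two-input model would instead need: yield [x1, x2], y
    while True:
        x = np.random.rand(batch_size, 600, 451, 3).astype('float32')
        y = np.random.randint(0, 2, size=(batch_size, 1)).astype('float32')
        yield x, y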

Upvotes: 2

pdpino

Reputation: 494

Are you using a TensorBoard callback? If so, you could try adding this before creating the model:

import keras.backend as K
K.clear_session()

See this answer

Upvotes: 6

Daniel Möller

Reputation: 86610

Not sure this is the cause, but something is not compatible with the validation data.

If you have the validation data as arrays, you pass it as validation_data=(array_x, array_y), and there are no validation_steps.

Now, if it's a generator, then you need to pass it as validation_data=someGenerator, and then also pass validation_steps=number_of_batches_expected_from_generator.
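
A minimal sketch of the two variants, reusing names from the question where possible (train_generator and val_generator are placeholders):

# Variant 1: validation data held in memory as arrays -> no validation_steps
model.fit_generator(
    train_generator,                       # placeholder training generator
    steps_per_epoch=steps_train,
    validation_data=(x_val_mem, y_val_mem),
    epochs=epochs,
)

# Variant 2: validation data comes from a generator -> validation_steps required
model.fit_generator(
    train_generator,
    steps_per_epoch=steps_train,
    validation_data=val_generator,         # placeholder validation generator
    validation_steps=steps_val,
    epochs=epochs,
)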

Upvotes: 1
