Kong

Reputation: 2410

TensorFlow processing with extremely large data

I have a large image dataset of shape (600000, 20, 64, 64) that I am trying to load and use with TensorFlow. I wanted to use feed_dict with a batch size, but the array is too large to even load into a NumPy array. Is there a method I can use that will not require loading the entire array into memory?

Upvotes: 1

Views: 183

Answers (1)

freude

Reputation: 3832

The "feed_dict" method is good only for small data. For large data, there is a tfrecord or csv formats which may be read in a separate thread by chunks which populate a tensorflow input data queue. More info on data reading from HDD is here - how to read data in tensorflow.

There is also a nice blog post that shows how to prepare TFRecords from raw data.
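A minimal sketch of that conversion step, assuming the raw data can be sliced chunk by chunk (for example via np.load with mmap_mode="r" or h5py); the helper load_chunk, the chunk size, and the file name are hypothetical placeholders:

    import numpy as np
    import tensorflow as tf

    def load_chunk(start, size):
        # Hypothetical loader returning a (size, 20, 64, 64) float32 slice
        # of the full dataset, e.g. from a memory-mapped file.
        raise NotImplementedError

    # Write examples chunk by chunk so the full array never sits in memory.
    with tf.python_io.TFRecordWriter("images.tfrecords") as writer:
        chunk_size = 1000
        for start in range(0, 600000, chunk_size):
            chunk = load_chunk(start, chunk_size)   # only this slice is in memory
            for example_array in chunk:
                feature = {"image": tf.train.Feature(
                    bytes_list=tf.train.BytesList(
                        value=[example_array.astype(np.float32).tobytes()]))}
                example = tf.train.Example(
                    features=tf.train.Features(feature=feature))
                writer.write(example.SerializeToString())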

Upvotes: 2
