Kaede

Reputation: 63

create a tensor proto whose content is larger than 2GB

I created an ndarray W of shape (2^22, 256), and I tried to use this array as the initialization of my weight matrix using:

w = tf.Variable(tf.convert_to_tensor(W))

then TensorFlow raised an error: ValueError: Cannot create a tensor proto whose content is larger than 2GB.

How can I fix this problem? PS. my weight matrix must be initialized with that (2^22, 256) matrix. THX :)

Upvotes: 6

Views: 13072

Answers (2)

mindis

Reputation: 104

For tf v1.14.0 you can solve this with tf.compat.v1.enable_eager_execution(); tf v2.0+ doesn't throw the error in this situation at all.
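A minimal sketch of this approach, assuming TF 1.14+ or 2.x. With eager execution on, the variable holds the values directly instead of serializing them into a GraphDef proto, so the 2GB limit never applies. The demo shape here is smaller than the question's for brevity; the same call works for the full (2^22, 256) array.

```python
import numpy as np
import tensorflow as tf

# On TF 1.14 this switches to eager mode; on TF 2.x eager is already the
# default and this call is a harmless no-op.
tf.compat.v1.enable_eager_execution()

# Demo shape; replace with (2**22, 256) in practice.
W = np.random.randn(1024, 256).astype(np.float32)

# No graph proto is built here, so no 2 GB serialization limit is hit.
w = tf.Variable(W)
total = float(tf.reduce_sum(w))
```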

Upvotes: -2

Patwie

Reputation: 4450

Protobuf has a hard limit of 2GB, and 2^22 * 256 float32 values are 4GB. Your problem is that you embed the initial value into the graph proto by

import tensorflow as tf
import numpy as np

w_init = np.random.randn(2**22, 256).astype(np.float32)
w = tf.Variable(tf.convert_to_tensor(w_init))
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(tf.reduce_sum(w)))

causing

ValueError: Cannot create a tensor proto whose content is larger than 2GB.

The graph definition above is basically saying: "The graph has a variable occupying 4GB and here are the exact values: ..."

Instead, you should write

import tensorflow as tf
import numpy as np

w_init = np.random.randn(2**22, 256).astype(np.float32)
w_plhdr = tf.placeholder(dtype=tf.float32, shape=[2**22, 256])
w = tf.get_variable('w', [2**22, 256])
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(w.assign(w_plhdr), {w_plhdr: w_init})
    print(sess.run(tf.reduce_sum(w)))

This way, your variable holds 4GB of values, but the graph definition only records: "Hey, there is a variable of size 4GB. Don't store the exact values in the graph definition, because there is an operation that will overwrite them later anyway."
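A closely related pattern, sketched below under the same TF1 assumptions, feeds the placeholder through the variable's own initializer op instead of a separate assign. The placeholder, not the data, becomes the initial value recorded in the graph proto, so the 2GB limit is again avoided. The shape is shrunk here for brevity; the full (2^22, 256) works the same way.

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Demo shape; the same pattern applies unchanged to (2**22, 256).
w_init = np.random.randn(1024, 256).astype(np.float32)

# Only the placeholder ends up in the graph proto, never the 4 GB of data.
w_plhdr = tf.placeholder(dtype=tf.float32, shape=w_init.shape)
w = tf.get_variable('w_feed_demo', initializer=w_plhdr)

with tf.Session() as sess:
    # Feed the real values only when the initializer op actually runs.
    sess.run(w.initializer, {w_plhdr: w_init})
    w_val, total = sess.run([w, tf.reduce_sum(w)])
```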

Upvotes: 9
