Kay Jersch

Reputation: 297

tf.gradients only returns [None]

I tried to build my own Deep Dream algorithm with the following code, using Google's Inception network:

import tensorflow as tf
import matplotlib.pyplot as plt
import numpy as np
# I am using the Kadenze CADL helper function ---> https://github.com/pkmital/CADL/tree/master/session-4/libs
import inception

# Start from a random image (a plain NumPy array)
img = np.random.rand(1, 1920, 1080, 3)

net = inception.get_inception_model()
tf.import_graph_def(net['graph_def'], name='inception')
graph = tf.get_default_graph()
layer = graph.get_tensor_by_name('inception/mixed5b_pool_reduce_pre_relu:0')
gradient = tf.gradients(tf.reduce_mean(layer), img)
sess = tf.Session()
init = tf.global_variables_initializer()
iters = 1440

sess.run(init)
for i in range(iters):
    print(i + 1)
    # Gradient-ascent step: add the gradient to the image and save the frame
    grad = sess.run(gradient[0])[0]
    img += grad
    plt.imshow(img[0])
    plt.savefig('output/' + str(i + 1) + '.png')
    plt.close('all')

But the line tf.gradients(tf.reduce_mean(layer), img) only returns [None]. This (of course) causes an error. Can anyone tell me how to fix it?
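Edit: For reference, I suspect the cause is that img is a plain NumPy array rather than a tensor in the graph, and tf.gradients can only differentiate with respect to tensors that are part of the graph. Below is a minimal sketch of the usual workaround: take the gradient with respect to the graph's own input tensor and feed the NumPy image in at run time. The input name 'inception/input:0' is an assumption on my part, not something I have verified for this helper's graph:

import tensorflow as tf
import numpy as np
import inception

net = inception.get_inception_model()
tf.import_graph_def(net['graph_def'], name='inception')
graph = tf.get_default_graph()

# Differentiate w.r.t. the graph's input tensor, not the NumPy array.
# 'inception/input:0' is a guess -- inspect graph.get_operations() for the real name.
x = graph.get_tensor_by_name('inception/input:0')
layer = graph.get_tensor_by_name('inception/mixed5b_pool_reduce_pre_relu:0')
gradient = tf.gradients(tf.reduce_mean(layer), x)[0]

img = np.random.rand(1, 1920, 1080, 3).astype(np.float32)
with tf.Session() as sess:
    # Feed the NumPy image through the input tensor; the gradient is no longer None.
    grad = sess.run(gradient, feed_dict={x: img})
    img += grad  # one gradient-ascent step on the image

If the imported graph does not expose a usable input tensor, the other common pattern is to create a tf.placeholder yourself and wire it in through the input_map argument of tf.import_graph_def.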

Upvotes: 1

Views: 1100

Answers (0)
