John Boersma

Reputation: 33

Matrix value not needed for Lop?

In the theano derivatives tutorial here:

http://deeplearning.net/software/theano/tutorial/gradients.html#tutcomputinggrads

the example of Lop works without an explicit value for the W matrix in the dot product. And indeed, the partial derivatives in this case do not involve the components of W, so their values are not needed.
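The math behind this can be checked with plain NumPy (a sketch of my own, not from the tutorial): for y = x · W, the L-operator computes v^T (dy/dW), which works out to the outer product of x and v, so W's values never appear.

```python
import numpy as np

# For y = x . W, we have dy_j/dW_ik = x_i * delta_jk, so the
# vector-Jacobian product v^T (dy/dW) is simply outer(x, v).
x = np.array([0.0, 1.0])
v = np.array([2.0, 2.0])

vjp = np.outer(x, v)  # same shape as W, but W's values are never used
print(vjp)            # [[0. 0.] [2. 2.]]
```

This matches the shape of W without ever reading it, which is why Theano can drop W from the compiled graph.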

But, attempting a similar thing with the Rop throws an error:

theano.gof.fg.MissingInputError: ("An input of the graph, used to compute dot(Elemwise{second,no_inplace}.0, ), was not provided and not given a value.

How is this different?

Upvotes: 1

Views: 122

Answers (1)

dontloo

Reputation: 10865

Theano will try to optimize the computation graph, but it does not always work.

In the Lop example, Theano can detect that the values of W are not actually needed, but in the Rop case it can't.

The Lop example:

import theano
import theano.tensor as T

W = T.dmatrix('W')
v = T.dvector('v')
x = T.dvector('x')
y = T.dot(x, W)
VJ = T.Lop(y, W, v)              # vector-Jacobian product v^T (dy/dW)
f = theano.function([v, x], VJ)  # note: W is not among the inputs
f([2, 2], [0, 1])

If I just change y = T.dot(x, W) to y = T.dot(x, W**1), Theano fails to perform the optimization and throws the same error message at me, saying that I did not provide enough inputs.

Conversely, in the Rop example, changing the values given to W does not affect the result at all, because the result does not actually depend on W; Theano simply failed to optimize it away, so it still demands W as an input.
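Again this can be verified with plain NumPy (my own sketch): for y = x · W, the R-operator computes the Jacobian-vector product (dy/dW) V = x · V, which does not reference W's values either. A finite-difference check at two different W gives identical answers:

```python
import numpy as np

x = np.array([0.0, 1.0])
V = np.array([[2.0, 2.0],
              [2.0, 2.0]])

def y(W):
    return x @ W

# Directional derivative of y along V, evaluated at two different W:
eps = 1e-6
for W in (np.ones((2, 2)), np.full((2, 2), 5.0)):
    jvp = (y(W + eps * V) - y(W)) / eps
    print(np.round(jvp, 4))  # the same answer regardless of W
```

Because y is linear in W, the Jacobian-vector product is exactly x @ V, so any value passed for W in the compiled Rop function is irrelevant to the output.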

p.s. I find the Theano documents very unclear sometimes.

Upvotes: 0
