Nischal Karthik

Reputation: 1

Calculating Jacobians and gradients using TensorFlow

I'm trying to solve the 2D Darcy equation in its mixed formulation. Suppose I have a target vector and a source vector as follows:

u = [u1, u2, p]
x = [x, y]

grad(u, x) =
[du1/dx, du2/dx, dp/dx;
 du1/dy, du2/dy, dp/dy]

I don't understand whether this is what I get when I call tf.gradients(u, x).

Upvotes: 0

Views: 70

Answers (1)

Laplace Ricky

Reputation: 1687

tf.gradients(u, x) doesn't return what you want. From https://www.tensorflow.org/api_docs/python/tf/gradients:

gradients() adds ops to the graph to output the derivatives of ys with respect to xs. It returns a list of Tensor of length len(xs) where each tensor is the sum(dy/dx) for y in ys and for x in xs.

Here is how you can get the Jacobian instead:

import tensorflow as tf

x = tf.constant([3.0, 4.0])

with tf.GradientTape() as tape:
  # x is a constant, not a Variable, so the tape has to watch it explicitly
  tape.watch(x)
  u1 = x[0]**2 + x[1]**2
  u2 = x[0]**2
  u3 = x[1]**3
  u = tf.stack([u1, u2, u3])

# full Jacobian du_i/dx_j, shape (3, 2)
J = tape.jacobian(u, x)
print(J)
'''
tf.Tensor(
[[ 6.  8.]
 [ 6.  0.]
 [ 0. 48.]], shape=(3, 2), dtype=float32)
'''
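For comparison, here is a minimal sketch of the summing behaviour described in the docs, using the same toy u as above (this example is not part of the original answer): GradientTape.gradient returns the derivatives already summed over the output components, i.e. the column sums of the Jacobian, rather than the full matrix.

import tensorflow as tf

x = tf.constant([3.0, 4.0])

with tf.GradientTape() as tape:
  tape.watch(x)
  u = tf.stack([x[0]**2 + x[1]**2, x[0]**2, x[1]**3])

# gradient() sums du_i/dx over all components of u,
# i.e. the column sums of the Jacobian above: [6+6+0, 8+0+48]
g = tape.gradient(u, x)
print(g)
'''
tf.Tensor([12. 56.], shape=(2,), dtype=float32)
'''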

Upvotes: 1
