Ceveloper

Reputation: 83

Does the TensorFlow C++ API support automatic differentiation for backpropagation?

Does the TensorFlow C++ API support automatic differentiation to backpropagate gradients?
If I build a graph in C++ and run it from C++ code (not from Python!), will automatic differentiation work?

Let's suppose every op in the graph has a gradient implementation.

I think the documentation regarding what the TensorFlow C++ API can and cannot do is very poor.

Thank you very much for the help

Upvotes: 2

Views: 1322

Answers (1)

javidcf

Reputation: 59731

Technically it can, but AFAIK automatic differentiation is only "configured" in Python. What I mean is that, at a lower level, each TensorFlow operation does not itself declare what its gradient is (that is, the corresponding operation that computes its gradient); that is declared at the Python level instead. For example, take a look at math_ops.py: you will see, among other things, several functions decorated with @ops.RegisterGradient(...). This decorator adds the function to a global registry (in Python) of operations and their gradients. This is also why the optimizer classes are largely implemented in Python: they rely on this registry to build the backpropagation computation (as opposed to using native TensorFlow primitives to that end, which do not exist).

So the point is that you can do the same computations using the same ops (which are in turn implemented with the same kernels), but I don't think C++ has (or will ever have) such a gradient registry (or optimizer classes), so you would need to work out or copy that backpropagation construction yourself. In general, the C++ API is not well suited to building the computation graph.
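To give a sense of what "working out the backpropagation construction yourself" means, here is a minimal hand-written reverse-mode sketch for a tiny function, in plain Python. It is illustrative only; in C++ the analogous work would be emitting the corresponding gradient ops into the graph by hand:

```python
# Hand-written forward and backward pass for y = (x * w + b)**2,
# applying the chain rule op by op in reverse -- the construction that
# the Python optimizer classes automate via the gradient registry.

def forward_and_backward(x, w, b):
    # Forward pass, keeping the intermediates the backward pass needs.
    z = x * w + b          # affine step
    y = z ** 2             # squared output

    # Backward pass: propagate dy/dy = 1 through each op in reverse.
    dz = 2.0 * z           # d(z^2)/dz
    dw = dz * x            # d(x*w + b)/dw = x
    db = dz * 1.0          # d(x*w + b)/db = 1
    return y, dw, db

y, dw, db = forward_and_backward(x=2.0, w=3.0, b=1.0)
print(y, dw, db)  # 49.0 28.0 14.0
```

For a graph of any realistic size this bookkeeping is exactly what you would not want to maintain by hand, which is why the lack of a C++ registry matters in practice.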

Now, a different question (and maybe this is what you were asking in the first place) is whether you can run an already existing graph that does backpropagation in C++. By this I mean building the computation graph in Python, creating an optimizer (which in turn creates the necessary operations in the graph to compute the gradients and update the variables), exporting the graph, and then loading and running that graph in C++. That is entirely possible and no different from running any other kind of graph in TensorFlow C++.

Upvotes: 3

Related Questions