Reputation: 75
I cannot find proper documentation on how to use TensorFlow v2 for basic dataflow programming. I can find many resources online about TensorFlow v1, but much of the behaviour they explain is now deprecated. For instance, the following Python code works fine in TensorFlow v1:
import tensorflow as tf
# Define some helper functions
def func1(a, b):
    return a + b  # Or any manipulation of the arguments
def func2(a, b):
    return a * b
def func3(a, b):
    return a - b
# Define a graph; (a,b)-->e, (c,d)-->f, (e,f)-->g
a = tf.placeholder(tf.float64)
b = tf.placeholder(tf.float64)
c = tf.placeholder(tf.float64)
d = tf.placeholder(tf.float64)
e = tf.py_func(func1, [a,b], tf.float64)
f = tf.py_func(func2, [c,d], tf.float64)
g = tf.py_func(func3, [e,f], tf.float64)
# Execute in a session
sess1 = tf.Session()
res = sess1.run([g], feed_dict={a:1, b:2, c:3, d:4})
print(res) # = [-9.0]
Here I have converted some Python functions into TensorFlow operations; I have defined a graph in which g depends on e and f, which in turn depend on a, b, c, d; finally, I have executed a session and provided inputs for a, b, c, d.
I can also avoid providing all inputs and feed intermediate nodes instead:
sess2 = tf.Session()
res = sess2.run([g], feed_dict={a:1, b:2, f:12})
print(res) # = [-9.0]
Here, I provide f directly, instead of its dependencies c and d.
How would something like this work in TensorFlow v2 (besides importing tensorflow.compat.v1)? Is TensorFlow v2 even meant for this kind of general dataflow programming, or is it just for machine learning?
Note: This is somewhat related to this question, which deals with a much more complicated problem.
Upvotes: 1
Views: 148
Reputation: 678
As far as I understand, the main motivation of TF 2.0 was to get away from the old style of first defining the whole dataflow graph and then executing it at the end, and to switch to a more Pythonic way of doing things.
An important part of this is to define functions in a Pythonic way and then decorate them with @tf.function to have TensorFlow generate a graph and improve performance.
The "kitchen sink" approach of putting everything into one large graph and then running a session for whatever you need from it is discouraged in TF2.
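As a minimal illustration of what "more Pythonic" means here (my sketch, not part of the original answer): in TF2, eager execution is the default, so operations run immediately and return concrete values, with no session required:
import tensorflow as tf

# Eager execution: this multiplication runs immediately,
# with no graph construction or session needed.
x = tf.constant(3.0)
y = tf.constant(4.0)
print(x * y)  # tf.Tensor(12.0, shape=(), dtype=float32)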
In your example, this would look like:
import tensorflow as tf
# Define some helper functions
def func1(a, b):
    return a + b  # Or any manipulation of the arguments
def func2(a, b):
    return a * b
def func3(a, b):
    return a - b
If you only need one large function, this should be sufficient:
# Define a combined tf function; the decorator ensures a graph is generated.
@tf.function
def tf2_function(a, b, c, d):
    e = func1(a, b)
    f = func2(c, d)
    g = func3(e, f)
    return g

g = tf2_function(1, 2, 3, 4)  # -9
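Note that with tensor inputs the call returns a tf.Tensor rather than a plain number; a small sketch:
# Passing tensors instead of Python ints; .numpy() extracts the value.
g = tf2_function(tf.constant(1.0), tf.constant(2.0),
                 tf.constant(3.0), tf.constant(4.0))
print(g.numpy())  # -9.0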
But if you need the option to get intermediate values out, or to provide intermediate values yourself, then the best option is to split the function into two smaller functions, as recommended by the TF developers.
@tf.function
def get_e_f(a, b, c, d):
    # Return both intermediate values so they can be inspected or reused.
    e = func1(a, b)
    f = func2(c, d)
    return e, f

@tf.function
def get_g_from_e_f(e, f):
    g = func3(e, f)
    return g
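Used together, these two functions reproduce both session runs from the question (a usage sketch with the same values):
# Full run, equivalent to feed_dict={a: 1, b: 2, c: 3, d: 4}
e, f = get_e_f(1.0, 2.0, 3.0, 4.0)
g = get_g_from_e_f(e, f)     # -9.0

# Partial run, equivalent to feed_dict={a: 1, b: 2, f: 12}
g = get_g_from_e_f(e, 12.0)  # -9.0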
You also don't need to decorate every function with @tf.function; it is usually enough to decorate the larger, more expensive functions, and all functions called by them will be converted into the graph as well.
But yes, there is no real TF2 equivalent of the TF1.x workflow of building one graph and then feeding it different values to get different outputs.
Upvotes: 1