July 17, 2018 —
Posted by Alex Wiltschko, Dan Moldovan, Wolff Dobson
We’d like to tell you about a new TensorFlow feature called “AutoGraph”. AutoGraph converts Python code, including control flow, print() and other Python-native features, into pure TensorFlow graph code.
Writing TensorFlow code without using eager execution requires you to do a little metaprogramming: you write a program that creates a graph, and that graph is then executed. This can be tricky for code that uses control flow like if and while, has side effects like print(), or accepts structured input. For example:

```python
def huber_loss(a):
  # delta is assumed to be defined in the enclosing scope
  if tf.abs(a) <= delta:
    loss = a * a / 2
  else:
    loss = delta * (tf.abs(a) - delta / 2)
  return loss
```
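For intuition, the piecewise Huber logic can be sketched in plain Python with no TensorFlow at all; here delta is fixed at 1.0 purely for illustration, since the example above leaves it defined elsewhere:

```python
def huber_loss_py(a, delta=1.0):
    # Quadratic near zero, linear in the tails; delta=1.0 is an
    # illustrative choice, not a value taken from the example above.
    if abs(a) <= delta:
        return a * a / 2
    return delta * (abs(a) - delta / 2)

print(huber_loss_py(0.5))  # inside the quadratic region: 0.125
print(huber_loss_py(9.0))  # in the linear tail: 8.5
```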
With eager execution, this would "just work"; however, such operations may be slow due to Python interpreter overhead or missed program optimization opportunities. To make the function graph-ready, you would need to rewrite it using constructs like tf.cond(), but that can be tedious and difficult to implement. AutoGraph can do this conversion automatically for you, keeping the ease of eager programming while reaping the performance benefits of graph-based execution. In our example, we can decorate our function with @autograph.convert(), and AutoGraph will automatically generate graph-ready code. This:

```python
@autograph.convert()
def huber_loss(a):
  if tf.abs(a) <= delta:
    loss = a * a / 2
  else:
    loss = delta * (tf.abs(a) - delta / 2)
  return loss
```
becomes this code at execution time, thanks to the decorator:

```python
def tf__huber_loss(a):
  with tf.name_scope('huber_loss'):

    def if_true():
      with tf.name_scope('if_true'):
        loss = a * a / 2
        return loss,

    def if_false():
      with tf.name_scope('if_false'):
        loss = delta * (tf.abs(a) - delta / 2)
        return loss,

    loss = ag__.utils.run_cond(tf.less_equal(tf.abs(a), delta), if_true,
        if_false)
    return loss
```
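The key piece of the generated code is run_cond: both branches are passed as zero-argument callables, and only the selected one runs when the condition is evaluated. A minimal eager stand-in (an illustrative sketch, not the actual ag__.utils.run_cond implementation) looks like this:

```python
def run_cond(pred, true_fn, false_fn):
    # Each branch is a thunk; only the chosen branch executes,
    # mirroring the semantics of tf.cond.
    return true_fn() if pred else false_fn()

print(run_cond(3 <= 1, lambda: 'small', lambda: 'large'))  # large
```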
You can then call your code as if it were a TensorFlow op:

```python
with tf.Graph().as_default():
  x_tensor = tf.constant(9.0)

  # The converted function works like a regular op: tensors in, tensors out.
  huber_loss_tensor = huber_loss(x_tensor)

  with tf.Session() as sess:
    print('TensorFlow result: %2.2f\n' % sess.run(huber_loss_tensor))
```
As you can see, AutoGraph bridges the gap between eager execution and graphs. AutoGraph takes in your eager-style Python code and converts it to graph-generating code. If you prefer not to use a decorator, you can instead call the to_graph() function to turn a function into a graph:

```python
def collatz(a):
  counter = 0
  while a != 1:
    if a % 2 == 0:
      a = a // 2
    else:
      a = 3 * a + 1
    counter = counter + 1
  return counter

graph_mode_collatz = autograph.to_graph(collatz)
# The code is human-readable, too
print(autograph.to_code(collatz))

# n is assumed to be defined elsewhere
collatz_tensor = graph_mode_collatz(tf.constant(n))
```
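As a quick sanity check of the logic itself, the same Collatz count runs unchanged in plain Python:

```python
def collatz_py(a):
    # Count the steps until the Collatz sequence reaches 1.
    counter = 0
    while a != 1:
        if a % 2 == 0:
            a = a // 2
        else:
            a = 3 * a + 1
        counter = counter + 1
    return counter

print(collatz_py(6))  # 6 -> 3 -> 10 -> 5 -> 16 -> 8 -> 4 -> 2 -> 1: 8 steps
```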
AutoGraph can support arbitrary nested control flow, such as:

```python
def f(n):
  if n >= 0:
    while n < 5:
      n += 1
      print(n)
  return n
```
AutoGraph allows you to append elements to arrays inside loops. To make this work, we use some AutoGraph helpers, set_element_type and stack:

```python
def f(n):
  z = []
  # We ask you to tell us the element dtype of the list
  autograph.set_element_type(z, tf.int32)
  for i in range(n):
    z.append(i)
  # When you're done with the list, stack it
  # (this is just like np.stack)
  return autograph.stack(z)
```
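The eager-Python equivalent of this pattern is just list-building followed by stacking; this sketch uses a plain Python list standing in for the stacked tensor, to show the shape of the computation:

```python
def f_py(n):
    # Build the list in a loop, then return it as one sequence;
    # autograph.stack(z) would instead produce a single tensor.
    z = []
    for i in range(n):
        z.append(i)
    return z

print(f_py(4))  # [0, 1, 2, 3]
```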
We also support constructs like break, continue, and even print and assert. When converted, this snippet's Python assert converts to a graph that uses the appropriate tf.Assert:

```python
def f(x):
  assert x != 0, 'Do not pass zero!'
  return x * x
```
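Conceptually, the converted graph checks the assertion before the dependent computation runs. That gating behavior can be sketched in plain Python; this is an illustrative stand-in for tf.Assert plus a control dependency, not AutoGraph's actual output:

```python
def run_with_assert(pred, message, compute):
    # The assertion gates the computation, just as tf.Assert does
    # via a control dependency in the generated graph.
    if not pred:
        raise ValueError(message)
    return compute()

x = 3
print(run_with_assert(x != 0, 'Do not pass zero!', lambda: x * x))  # 9
```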
Having the ability to easily add loops, control flow, and more to graphs means that it's easy to move the training loop into the graph. An example of this can be found in this notebook, where we take an RNN training loop and execute it with a single sess.run() call. This could be useful in situations where you need to pass an entire training loop to an accelerator, rather than managing training via a CPU controller.

AutoGraph opens up the possibility of writing graph code with natural Python control flow like if and while. Today, you can get graph performance from eager-compatible code by using tf.contrib.eager.defun, but this requires you to use graph TensorFlow ops like tf.cond(). In the future, AutoGraph will be seamlessly integrated with defun to allow authoring graph code in plain eager-style Python. When that implementation is available, you can expect to use AutoGraph to speed up hotspots by selectively turning eager code into graph fragments.

AutoGraph is currently in contrib, but we expect to move it into core TensorFlow soon.