July 17, 2018 —
                                          
Posted by Alex Wiltschko, Dan Moldovan, Wolff Dobson
We’d like to tell you about a new TensorFlow feature called “AutoGraph”. AutoGraph converts Python code, including control flow, print() and other Python-native features, into pure TensorFlow graph code.
Writing TensorFlow code without using eager execution requires you to do a little metaprogramming: you write a program that creates a graph, and then that graph is executed later. This can be confusing, especially for new developers.
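To make the "write a program that creates a graph, then execute it later" idea concrete, here is a toy, framework-free sketch (our own illustration, not TensorFlow code): building the expression only records nodes, and nothing is computed until the graph is explicitly run.

```python
# Toy deferred-execution sketch (illustrative only, not TensorFlow code).
# Arithmetic on Node objects builds a graph; run() evaluates it later.

class Node:
    def __init__(self, op, inputs=(), value=None):
        self.op, self.inputs, self.value = op, inputs, value

    def __add__(self, other):
        return Node('add', (self, other))

    def __mul__(self, other):
        return Node('mul', (self, other))

def constant(v):
    return Node('const', value=v)

def run(node):
    # Executed later, much like sess.run() on a TensorFlow graph.
    if node.op == 'const':
        return node.value
    a, b = (run(i) for i in node.inputs)
    return a + b if node.op == 'add' else a * b

graph = constant(2) * constant(3) + constant(4)  # nothing computed yet
print(run(graph))  # evaluates the graph now: prints 10
```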

Graph-based computation can be especially tricky for models that use control flow like if and while, or ones that have side effects like print(), or accept structured input. For example, consider this simple eager-style function (delta is a variable defined elsewhere):

def huber_loss(a):
  if tf.abs(a) <= delta:
    loss = a * a / 2
  else:
    loss = delta * (tf.abs(a) - delta / 2)
  return loss

With eager execution enabled, this "just works"; to use it in a graph, however, you would need to rewrite it yourself using graph constructs like tf.cond(), but that can be tedious and difficult to implement. AutoGraph can do this conversion automatically for you, keeping the ease of eager programming while reaping the performance benefit of graph-based execution. In our example, we can decorate our function with autograph.convert(), and AutoGraph will automatically generate graph-ready code:

@autograph.convert()
def huber_loss(a):
  if tf.abs(a) <= delta:
    loss = a * a / 2
  else:
    loss = delta * (tf.abs(a) - delta / 2)
  return loss

At execution time, due to the decorator, this becomes the following graph-building code:

def tf__huber_loss(a):
  with tf.name_scope('huber_loss'):
    def if_true():
      with tf.name_scope('if_true'):
        loss = a * a / 2
        return loss,
    def if_false():
      with tf.name_scope('if_false'):
        loss = delta * (tf.abs(a) - delta / 2)
        return loss,
    loss = ag__.utils.run_cond(tf.less_equal(tf.abs(a), delta), if_true,
        if_false)
    return loss

You can then call your code as if it were a regular TensorFlow op:

with tf.Graph().as_default():
  x_tensor = tf.constant(9.0)
  # The converted function works like a regular op: tensors in, tensors out.
  huber_loss_tensor = huber_loss(x_tensor)
  with tf.Session() as sess:
    print('TensorFlow result: %2.2f\n' % sess.run(huber_loss_tensor))

You don't have to use the decorator, either. AutoGraph works on functions directly. For example, here is a function with a while loop; we use the to_graph() function to turn this into a graph:

def collatz(a):
    counter = 0
    while a != 1:
        if a % 2 == 0:
            a = a // 2
        else:
            a = 3 * a + 1
        counter = counter + 1
    return counter
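Because the unconverted collatz is ordinary Python, it is easy to sanity-check eagerly before converting it (a quick check of our own, restating the function so the snippet is self-contained):

```python
# The unconverted collatz() is plain Python, so we can test it directly.
def collatz(a):
    counter = 0
    while a != 1:
        if a % 2 == 0:
            a = a // 2
        else:
            a = 3 * a + 1
        counter = counter + 1
    return counter

print(collatz(6))  # 6 -> 3 -> 10 -> 5 -> 16 -> 8 -> 4 -> 2 -> 1: prints 8
```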
graph_mode_collatz = autograph.to_graph(collatz)
# The code is human-readable, too
print(autograph.to_code(collatz))
collatz_tensor = graph_mode_collatz(tf.constant(n))

AutoGraph can also handle arbitrarily nested control flow, such as:

def f(n):
  if n >= 0:
    while n < 5:
      n += 1
      print(n)
  return n

AutoGraph lets you append elements to lists inside loops, with the help of the utility functions set_element_type and stack:

def f(n):
  z = []
  # We ask you to tell us the element dtype of the list
  autograph.set_element_type(z, tf.int32)
  for i in range(n):
    z.append(i)
  # when you're done with the list, stack it
  # (this is just like np.stack)
  return autograph.stack(z)

AutoGraph also supports constructs like break, continue, and even print and assert. When converted, this snippet's Python assert converts to a graph that uses the appropriate tf.Assert:

def f(x):
  assert x != 0, 'Do not pass zero!'
  return x * x

Having a graph also means that, for example, an entire training loop can be executed in a single sess.run() call. This could be useful in situations where you need to pass an entire training loop to an accelerator, rather than manage training via a CPU controller. AutoGraph opens up the possibility of writing graph models with natural Python control flow like if and while. Today, you can also build graph functions from eager-style code with tf.contrib.eager.defun. This requires you to use graph TensorFlow ops like tf.cond(). In the future, AutoGraph will be seamlessly integrated with defun to allow authoring graph code in plain eager-style Python. When that implementation is available, you can expect to use AutoGraph to speed up hotspots by selectively turning eager code into graph fragments. AutoGraph is available in contrib today, but we expect to move it into core TensorFlow soon.