[TF 2.0] tf.keras.optimizers.Adam

System information

  • TensorFlow version: 2.0.0-dev20190618
  • Python version: 3.6

Describe the current behavior I am trying to minimize a function using tf.keras.optimizers.Adam.minimize() and I am getting a TypeError.

Describe the expected behavior First, the TF 2.0 docs say the loss can be a callable taking no arguments which returns the value to minimize. The TypeError I get instead reads "'tensorflow.python.framework.ops.EagerTensor' object is not callable", which does not clearly explain what went wrong.

But the main issue is: I know I can do the same optimization using GradientTape, but I don't understand why I should have to, or why minimize() is not working. A similar issue, and a Stack Overflow answer showing how to solve a similar problem with GradientTape, are linked for reference.

Code to reproduce the issue

import tensorflow as tf
import numpy as np

N  = 1000                               # Number of samples
n  = 4                                  # Dimension of the optimization variable
X = tf.Variable(np.random.randn(n, 1))  # Variables will be tuned by the optimizer
C = tf.constant(np.random.randn(N, n))  # Constants will not be tuned by the optimizer
D = tf.constant(np.random.randn(N, 1))

def f_batch_tensorflow(x, A, B):
    e = tf.matmul(A, x) - B
    return tf.reduce_sum(tf.square(e))

fx = f_batch_tensorflow(X, C, D)

adam_opt = tf.keras.optimizers.Adam()
optimizer = adam_opt.minimize(fx, X)

Other info / logs Following is the error I am getting:

TypeError                                 Traceback (most recent call last)
<ipython-input-20-225de189a3ff> in <module>()
      8 adam_opt = tf.keras.optimizers.Adam()
----> 9 optimizer = adam_opt.minimize(fx, X)
     10 print(optimizer)

1 frames
/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/keras/optimizer_v2/ in _compute_gradients(self, loss, var_list, grad_loss)
    347       if not callable(var_list):
--> 349       loss_value = loss()
    350     if callable(var_list):
    351       var_list = var_list()

TypeError: 'tensorflow.python.framework.ops.EagerTensor' object is not callable

Answer questions martinwicke

The function will already have been executed by the time minimize() is called. You need to pass f_batch_tensorflow (or rather, a partial of it) to minimize, not the result of executing it.

@dynamicwebpaige @tanzhenyu This is probably a common mistake; we should catch and rethrow this specific error (or type-check the input) to explain what went wrong, maybe: "EagerTensors cannot be passed to Optimizer.minimize. Please pass the callable representing the computation instead."
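For reference, here is a minimal sketch of the fix, reusing the names from the reproduction code above. The 100-step loop and the default Adam learning rate are arbitrary choices for illustration; the point is that minimize() receives a zero-argument callable (a lambda) rather than an already-evaluated EagerTensor:

```python
import numpy as np
import tensorflow as tf

N = 1000                                # Number of samples
n = 4                                   # Dimension of the optimization variable
X = tf.Variable(np.random.randn(n, 1))  # Variable tuned by the optimizer
C = tf.constant(np.random.randn(N, n))  # Constants, not tuned
D = tf.constant(np.random.randn(N, 1))

def f_batch_tensorflow(x, A, B):
    e = tf.matmul(A, x) - B
    return tf.reduce_sum(tf.square(e))

adam_opt = tf.keras.optimizers.Adam()

loss_before = float(f_batch_tensorflow(X, C, D))
for _ in range(100):
    # Pass the callable so minimize() can re-evaluate the loss each step.
    adam_opt.minimize(lambda: f_batch_tensorflow(X, C, D), var_list=[X])
loss_after = float(f_batch_tensorflow(X, C, D))

# The equivalent explicit-GradientTape form of one step, for comparison:
with tf.GradientTape() as tape:
    loss = f_batch_tensorflow(X, C, D)
grads = tape.gradient(loss, [X])
adam_opt.apply_gradients(zip(grads, [X]))
```

Under the hood, minimize() does exactly what the GradientTape block does: it calls the loss callable inside a tape, computes gradients with respect to var_list, and applies them.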
