When using Keras to serve predictions behind a web service, I hit an error. The error message is as follows:

self._make_predict_function()
  File "/usr/local/lib/python3.4/dist-packages/keras/engine/training.py", line 679, in _make_predict_function
    **self._function_kwargs)
  File "/usr/local/lib/python3.4/dist-packages/keras/backend/tensorflow_backend.py", line 615, in function
    return Function(inputs, outputs, updates=updates)
  File "/usr/local/lib/python3.4/dist-packages/keras/backend/tensorflow_backend.py", line 589, in __init__
    with tf.control_dependencies(self.outputs):
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/framework/ops.py", line 3192, in control_dependencies
    return get_default_graph().control_dependencies(control_inputs)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/framework/ops.py", line 2993, in control_dependencies
    c = self.as_graph_element(c)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/framework/ops.py", line 2291, in as_graph_element
    raise ValueError("Tensor %s is not an element of this graph." % obj)
ValueError: Tensor Tensor("Sigmoid_2:0", shape=(?, 17), dtype=float32) is not an element of this graph.

This error appears when using the TensorFlow backend. After searching for this error, I found it is a known issue that occurs when a TensorFlow graph is used across threads: the web server loads the model in one thread but runs inference in another, and TensorFlow's default graph is per-thread state.

Here's how to fix the bug:

import tensorflow as tf

# Right after loading or constructing your model, save the TensorFlow graph:
graph = tf.get_default_graph()

# In the other thread (or perhaps in an asynchronous event handler), do:
global graph
with graph.as_default():
    (... do inference here ...)
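To see why the saved graph has to be re-entered explicitly, note that TensorFlow keeps the "default graph" in thread-local storage, so a graph that is default in the loading thread is invisible to a worker thread. The sketch below simulates that mechanism with only the Python standard library; the `Graph` class and `current_default` helper are stand-ins I made up for `tf.Graph.as_default()` and `tf.get_default_graph()`, not TensorFlow APIs.

```python
import threading

# Stand-in for TensorFlow's per-thread default-graph stack.
_default = threading.local()

class Graph:
    """Toy model of tf.Graph: can be installed as a thread's default."""
    def as_default(self):
        graph = self
        class _Ctx:
            # Context manager mimicking tf.Graph.as_default():
            # installs this graph as the *current thread's* default.
            def __enter__(self_inner):
                _default.graph = graph
            def __exit__(self_inner, *exc):
                _default.graph = None
        return _Ctx()

def current_default():
    # Mimics tf.get_default_graph() lookup (None if unset in this thread).
    return getattr(_default, "graph", None)

# "Load the model" in the main thread: its graph lives here.
model_graph = Graph()
results = {}

def worker_without_fix():
    # The worker thread starts with no default graph, so tensors from
    # model_graph are "not an element of this graph" here.
    results["broken"] = current_default() is model_graph

def worker_with_fix():
    # The fix: explicitly re-enter the saved graph in this thread.
    with model_graph.as_default():
        results["fixed"] = current_default() is model_graph

for target in (worker_without_fix, worker_with_fix):
    t = threading.Thread(target=target)
    t.start()
    t.join()

print(results)  # {'broken': False, 'fixed': True}
```

This is why saving `graph = tf.get_default_graph()` at load time and wrapping every `model.predict(...)` call in `with graph.as_default():` resolves the ValueError.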

Reference: https://github.com/fchollet/keras/issues/2397
