tensorflow-1 in docker is crashing over and over #1

Open
NofLevi10root opened this issue Nov 8, 2022 · 1 comment

Comments

@NofLevi10root

Every time I try to run the Docker container, I get this message:
deeppass-tensorflow-1 | 2022-11-08 15:00:10.067722: I external/org_tensorflow/tensorflow/core/util/util.cc:169] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable TF_ENABLE_ONEDNN_OPTS=0.
deeppass-tensorflow-1 | 2022-11-08 15:00:10.075827: I tensorflow_serving/model_servers/server_core.cc:465] Adding/updating models.
deeppass-tensorflow-1 | 2022-11-08 15:00:10.075860: I tensorflow_serving/model_servers/server_core.cc:594] (Re-)adding model: password
deeppass-tensorflow-1 | 2022-11-08 15:00:10.267574: I tensorflow_serving/core/basic_manager.cc:740] Successfully reserved resources to load servable {name: password version: 1}
deeppass-tensorflow-1 | 2022-11-08 15:00:10.267642: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: password version: 1}
deeppass-tensorflow-1 | 2022-11-08 15:00:10.267655: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: password version: 1}
deeppass-tensorflow-1 | 2022-11-08 15:00:10.273474: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:45] Reading SavedModel from: /models/bilstm_model/1
deeppass-tensorflow-1 | 2022-11-08 15:00:10.393738: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:89] Reading meta graph with tags { serve }
deeppass-tensorflow-1 | 2022-11-08 15:00:10.393788: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:130] Reading SavedModel debug info (if present) from: /models/bilstm_model/1
deeppass-tensorflow-1 | 2022-11-08 15:00:10.395129: I external/org_tensorflow/tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 AVX512F AVX512_VNNI FMA
deeppass-tensorflow-1 | To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
deeppass-tensorflow-1 | 2022-11-08 15:00:10.597273: I external/org_tensorflow/tensorflow/compiler/mlir/mlir_graph_optimization_pass.cc:354] MLIR V1 optimization pass is not enabled
deeppass-tensorflow-1 | 2022-11-08 15:00:10.671806: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:229] Restoring SavedModel bundle.
deeppass-tensorflow-1 | terminate called after throwing an instance of 'std::bad_alloc'
deeppass-tensorflow-1 | what(): std::bad_alloc
deeppass-tensorflow-1 | /usr/bin/tf_serving_entrypoint.sh: line 3: 7 Aborted tensorflow_model_server --port=8500 --rest_api_port=8501 --model_name=${MODEL_NAME} --model_base_path=${MODEL_BASE_PATH}/${MODEL_NAME} "$@"
deeppass-tensorflow-1 exited with code 134

@Retrospected

I was facing the same issue, but figured out this is due to a bug in the tensorflow/serving container, also mentioned here: tensorflow/serving#2048

As a workaround for now, you can use an older version of the image by adding the 2.8.2 version tag in docker-compose.yml:

-    image: tensorflow/serving:latest
+    image: tensorflow/serving:2.8.2
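
For context, a minimal sketch of what the relevant service block in docker-compose.yml could look like with the pinned tag. The service name, port mappings, volume path, and environment values below are assumptions pieced together from the tensorflow/serving defaults visible in the log above, not the actual file from this repository:

  tensorflow:
    # pinned instead of :latest to avoid the std::bad_alloc crash
    image: tensorflow/serving:2.8.2
    ports:
      - "8500:8500"   # gRPC port used by --port in the entrypoint
      - "8501:8501"   # REST API port used by --rest_api_port
    environment:
      # model name "password" taken from the log; value is an assumption
      - MODEL_NAME=password
    volumes:
      # hypothetical host path; the log shows the model being read from /models/... inside the container
      - ./models:/models

After changing the tag, run docker compose pull and then docker compose up again so the older image is actually used instead of a cached :latest.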
