[Python] Add saved_weights example to tf notebook #26472

Merged 5 commits on May 1, 2023 · Changes from 1 commit
31 changes: 19 additions & 12 deletions examples/notebooks/beam-ml/run_inference_tensorflow.ipynb
@@ -74,18 +74,16 @@
"If your model uses `tf.Example` as an input, see the [Apache Beam RunInference with `tfx-bsl`](https://colab.research.google.com/github/apache/beam/blob/master/examples/notebooks/beam-ml/run_inference_tensorflow_with_tfx.ipynb) notebook.\n",
"\n",
"There are three ways to load a TensorFlow model:\n",
"1. Using a path to the saved model.\n",
"2. Using a path to the saved weights of model.\n",
"3. Using a URL for pretrained model on TensorFlow Hub (See this [notebook](https://github.com/apache/beam/blob/master/examples/notebooks/beam-ml/run_inference_with_tensorflow_hub.ipynb))\n",
"1. Provide a path to the saved model.\n",
"2. Provide a path to the saved weights of the model.\n",
"3. Provide a URL for pretrained model on TensorFlow Hub. For an example workflow, see [Apache Beam RunInference with TensorFlow and TensorFlow Hub](https://github.com/apache/beam/blob/master/examples/notebooks/beam-ml/run_inference_with_tensorflow_hub.ipynb).\n",
"\n",
"This notebook demonstrates the following steps:\n",
"- Build a simple TensorFlow model.\n",
"- Set up example data.\n",
"- Run those examples with the built-in model handlers using:\n",
" * Saved Model\n",
" * Saved Weights\n",
"\n",
" and get a prediction inside an Apache Beam pipeline.\n",
"- Run those examples with the built-in model handlers using one of the following methods, and then get a prediction inside an Apache Beam pipeline.:\n",
" * a saved model\n",
" * saved weights\n",
"\n",
"For more information about using RunInference, see [Get started with AI/ML pipelines](https://beam.apache.org/documentation/ml/overview/) in the Apache Beam documentation."
],
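For context, the two built-in loading modes listed above roughly translate to the handler construction sketched below. This is a sketch only: the paths are placeholders, and the stub `create_model` stands in for the model-building function defined later in the notebook.

```
# Sketch only: paths are placeholders; create_model is a stand-in for the
# notebook's own model-building function.
import tensorflow as tf
from tensorflow import keras

from apache_beam.ml.inference.tensorflow_inference import ModelType, TFModelHandlerNumpy


def create_model():
  # Placeholder single-unit architecture; the notebook defines its own version below.
  inputs = keras.layers.Input(shape=(1,), dtype=tf.float32, name='x')
  return keras.Model(inputs=inputs, outputs=keras.layers.Dense(1)(inputs))


# Option 1: point the handler at a complete saved model (the default model type).
saved_model_handler = TFModelHandlerNumpy('gs://<your-bucket>/saved_model_dir')

# Option 2: point the handler at saved weights and supply the function that
# rebuilds the architecture before the weights are loaded.
saved_weights_handler = TFModelHandlerNumpy(
    'gs://<your-bucket>/model_weights',
    model_type=ModelType.SAVED_WEIGHTS,
    create_model_fn=create_model)
```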
@@ -227,7 +225,7 @@
"x = numpy.arange(0, 100) # Examples\n",
"y = x * 5 # Labels\n",
"\n",
"# create_model builds a simple linear regression model.\n",
"# Use create_model to build a simple linear regression model.\n",
"# Note that the model has a shape of (1) for its input layer and expects a single int64 value.\n",
"def create_model():\n",
" input_layer = keras.layers.Input(shape=(1), dtype=tf.float32, name='x')\n",
@@ -332,7 +330,16 @@
{
"cell_type": "markdown",
"source": [
"Instead of saving the entire model, you can just save the model weights for inference. This is slightly lightweight than saving and loading the entire model. However, you need to pass the function to build TensorFlow model to the `TFModelHandlerNumpy` / `TFModelHandlerTensor` class along with `ModelType.SAVED_WEIGHTS`."
"Instead of saving the entire model, you can just [save the model weights for inference](https://www.tensorflow.org/guide/keras/save_and_serialize#saving_loading_only_the_models_weights_values). This is useful in cases when you're using the model just for inference and won't need any compilation information or optimizer state. This also allows loading the weights with new model in case of transfer learning applications.\n",
"\n",
"With this approach, you need to pass the function to build TensorFlow model to the `TFModelHandler` class you intend to use (`TFModelHandlerNumpy` / `TFModelHandlerTensor`) along with `model_type=ModelType.SAVED_WEIGHTS`.\n",
"\n",
"\n",
"\n",
"```\n",
"model_handler = TFModelHandlerNumpy(path_to_weights, model_type=ModelType.SAVED_WEIGHTS, create_model_fn=build_tensorflow_model)\n",
"```\n",
"\n"
],
"metadata": {
"id": "g_qVtXPeUcMS"
@@ -353,7 +360,7 @@
"cell_type": "markdown",
"source": [
"## Run the pipeline\n",
"#### Use the following code to run the pipeline by specifying path to the trained TensorFlow model."
"Use the following code to run the pipeline by specifying path to the trained TensorFlow model."
],
"metadata": {
"id": "0a1zerXycQ0z"
@@ -439,7 +446,7 @@
{
"cell_type": "markdown",
"source": [
"#### Use the following code to run the pipeline with the saved weights of a TensorFlow model.\n",
"Use the following code to run the pipeline with the saved weights of a TensorFlow model.\n",
"\n",
"To load the model with saved weights, the `TFModelHandlerNumpy` class requires a `create_model` function that builds and returns a TensorFlow model that is compatible with the saved weights."
],
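For context, the saved-weights variant of the pipeline follows the same shape. In this sketch, `save_weights_path` is the placeholder path used when saving the weights, and `create_model` is the model-building function described above.

```
import numpy

import apache_beam as beam
from apache_beam.ml.inference.base import RunInference
from apache_beam.ml.inference.tensorflow_inference import ModelType, TFModelHandlerNumpy

model_handler = TFModelHandlerNumpy(
    save_weights_path,             # placeholder path used when the weights were saved
    model_type=ModelType.SAVED_WEIGHTS,
    create_model_fn=create_model)  # rebuilds the architecture before loading the weights

with beam.Pipeline() as pipeline:
  _ = (
      pipeline
      | beam.Create([numpy.array([20.0], dtype=numpy.float32)])
      | RunInference(model_handler)
      | beam.Map(print))
```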