
I have downloaded a pre-trained PoseNet model for TensorFlow.js (tfjs) from Google, so it's a JSON file.

However, I want to use it on Android, so I need the .tflite model. Although someone has 'ported' a similar model from tfjs to tflite here, I have no idea which model (there are many variants of PoseNet) they converted, and I want to do the steps myself. I also don't want to run some arbitrary code that someone uploaded in a Stack Overflow answer:

Caution: Be careful with untrusted code—TensorFlow models are code. See Using TensorFlow Securely for details. (TensorFlow docs)

Does anyone know any convenient ways to do this?

1 Answer


You can find out which tfjs format you have by looking in the JSON file. It often says "graph-model". The difference between the formats is explained here.
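As a quick check, you can read the manifest's format field programmatically (a minimal sketch; the file name below is the one from the question, and the key name follows the usual tfjs convention):

import json

# Peek at the tfjs model manifest to see which format it declares.
with open("model-stride16.json") as f:
    manifest = json.load(f)

# Graph models typically report "graph-model"; layers models report "layers-model".
print(manifest.get("format"))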

From tfjs graph model to SavedModel (more common)

Use tfjs-to-tf by Patrick Levin.

import tensorflow as tf
import tfjs_graph_converter.api as tfjs

# Convert the tfjs graph model to a TensorFlow SavedModel.
tfjs.graph_model_to_saved_model(
    "savedmodel/posenet/mobilenet/float/050/model-stride16.json",
    "realsavedmodel"
)

# Code below taken from https://www.tensorflow.org/lite/convert/python_api
converter = tf.lite.TFLiteConverter.from_saved_model("realsavedmodel")
tflite_model = converter.convert()

# Save the TF Lite model.
with tf.io.gfile.GFile('model.tflite', 'wb') as f:
    f.write(tflite_model)
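To sanity-check the converted file, you can load it with the TF Lite interpreter and run a dummy input (a sketch; the actual input shape and dtype depend on the PoseNet variant you converted):

import numpy as np
import tensorflow as tf

# Load the converted model and allocate its tensors.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

# Feed a zero-filled tensor matching the model's declared input.
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])

interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()
print([interpreter.get_tensor(o["index"]).shape for o in output_details])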

From tfjs layers model to SavedModel

Note: This will only work for layers model format, not graph model format as in the question. I've written the difference between them here.


  1. Install the tensorflowjs package and use tensorflowjs_converter to convert the .json file into a Keras HDF5 file (from another SO thread).

On macOS, you'll face issues running pyenv (fix), and on Z shell, pyenv won't load correctly (fix). Also, once pyenv is running, use python -m pip install tensorflowjs instead of pip install tensorflowjs, because pyenv did not change the python used by pip for me.

Once you've followed the tensorflowjs_converter guide, run tensorflowjs_converter with no arguments to verify that it works; it should only warn you about a missing input_path argument. Then:

tensorflowjs_converter --input_format=tfjs_layers_model --output_format=keras tfjs_model.json hdf5_keras_model.hdf5
  2. Convert the Keras HDF5 file into a SavedModel (the standard TensorFlow model format) or directly into a .tflite file using the TFLiteConverter. The following runs in a Python file:
import tensorflow as tf

# Convert the model.
model = tf.keras.models.load_model('hdf5_keras_model.hdf5')
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Save the TF Lite model.
with tf.io.gfile.GFile('model.tflite', 'wb') as f:
    f.write(tflite_model)

or to save to a SavedModel:

import tensorflow as tf

# Convert the model.
model = tf.keras.models.load_model('hdf5_keras_model.hdf5')
tf.keras.models.save_model(
    model, 'saved_model_dir',  # output directory for the SavedModel
    overwrite=True, include_optimizer=True, save_format=None,
    signatures=None, options=None
)
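If you chose the SavedModel route, the resulting directory can then be converted to .tflite the same way as in the graph-model section above (a sketch; 'saved_model_dir' is the output path used in the previous snippet):

import tensorflow as tf

# Convert the SavedModel produced above into a TF Lite flatbuffer.
converter = tf.lite.TFLiteConverter.from_saved_model('saved_model_dir')
tflite_model = converter.convert()

# Save the TF Lite model.
with tf.io.gfile.GFile('model.tflite', 'wb') as f:
    f.write(tflite_model)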

Comments

In step 1, the tensorflowjs_converter fails to convert the PoseNet model. This GitHub issue implies that the format of PoseNet models (such as the one referenced by the OP) is tfjs_graph_model (not tfjs_layers_model), which cannot be converted to Keras. How did you manage to convert the PoseNet model to Keras?
Steps to reproduce: run the python:3.8 Docker container, install TensorFlow.js with pip install tensorflowjs, then try to convert: tensorflowjs_converter --input_format=tfjs_layers_model --output_format=keras /posenet/js/model-stride16.json /posenet/keras/hdf5_keras_model.hdf5. Error message: File "/usr/local/lib/python3.8/site-packages/tensorflow/python/keras/layers/serialization.py", line 101, in deserialize; layer_class_name = config['class_name']; KeyError: 'class_name'
Apologies Glen, I didn't know about the distinction between graph model and layers model formats. I was the OP and basically did not answer my own question; my answer will only work for tfjs_layers_model.
No worries. Meanwhile, I found a solution in TensorFlow.js Graph Model Converter, which can reliably convert a TensorFlow.js graph model to SavedModel format. From there, it should be quite straightforward to get a TF Lite model, if needed.
Thanks @Glen, I've been able to create a model.tflite as well, and have updated the answer for both graph and layers model formats.
