Unclear error from tensorflowjs_converter of USE large from TFHub

When I run

tensorflowjs_converter --input_format=tf_hub 'https://tfhub.dev/google/universal-sentence-encoder-large/5' use-large

it responds with

ValueError: Signature "default" does not exist in the saved model

I believe I correctly followed the directions in tensorflow/tfjs/tree/master/tfjs-converter on GitHub.

How can I proceed?


You should specify --signature_name for the converter CLI; the TF Hub module seems to use 'serving_default' as its signature now, instead of 'default':

tensorflowjs_converter --input_format=tf_hub --signature_name=serving_default 'https://tfhub.dev/google/universal-sentence-encoder-large/5' /tmp/web_model
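If you hit a signature mismatch like this, one way to see which signatures a SavedModel actually exposes is the saved_model_cli tool that ships with TensorFlow. A sketch, assuming you download and extract the TF Hub module to a local directory first (the local paths below are made up for illustration):

```shell
# Ask tfhub.dev for the raw SavedModel archive and extract it locally
curl -L 'https://tfhub.dev/google/universal-sentence-encoder-large/5?tf-hub-format=compressed' \
  -o use-large.tar.gz
mkdir use-large-saved-model && tar -xzf use-large.tar.gz -C use-large-saved-model

# Print every tag-set, signature name, and tensor shape in the SavedModel;
# look for names like "serving_default" in the output
saved_model_cli show --dir use-large-saved-model --all
```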

Thanks! It now says

ValueError: Unsupported Ops in the model before optimization
SegmentMean, ParallelDynamicStitch, StringJoin, SegmentSum, UnsortedSegmentSum, DynamicPartition, StaticRegexReplace

So am I right in assuming that without a huge effort I have to stick with the lite version of USE that has already been converted to TFJS? Or should I try converting the larger versions of USE to TensorFlow Lite which can be loaded into the browser?


In TFJS, if you find a model with unsupported ops, you can still run it in native mode with Node.js:
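A minimal sketch of what that looks like with tfjs-node, assuming the SavedModel has been downloaded and extracted to ./use-large-saved-model (that path and the input key below are assumptions, not verified against the actual USE signature):

```javascript
// Sketch: execute the unconverted SavedModel through the native TF runtime.
// Requires: npm install @tensorflow/tfjs-node
const tf = require('@tensorflow/tfjs-node');

async function main() {
  // loadSavedModel delegates execution to libtensorflow, so ops the
  // pure-JS/WebGL backends lack (SegmentMean, DynamicPartition, ...) run fine.
  const model = await tf.node.loadSavedModel(
      './use-large-saved-model', ['serve'], 'serving_default');

  // NOTE: the input key is an assumption; check the real key with
  // `saved_model_cli show --dir ./use-large-saved-model --all`.
  const output = model.predict({inputs: tf.tensor(['Hello world.'])});
  console.log(output);
}

main();
```

This keeps you on the original, higher-accuracy model, but it does require a Node.js process rather than running purely in the browser.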


Thanks. For our project we need a zero-install solution and don’t want to maintain our own server. So browser-only solutions work for us. But we can live with long loading times and slower sentence encoding, hence our interest in higher accuracy USE models.


As others have also suggested, if you get the above error detailing currently unsupported ops you have 2 options:

  1. Contribute the missing ops to the TensorFlow.js project. We are open source and welcome pull requests to help gain parity with TF Python. There are thousands of ops out there, so we initially converted the common ones; as a younger team it may be some time before parity is achieved. Na Li wrote a nice document on contributing ops here: tfjs/CONTRIBUTING_MISSING_OP.md at master · tensorflow/tfjs · GitHub

  2. Change the Python model so it avoids the unsupported op(s) and uses ones we do support. However, this can be hard: it is not immediately obvious which command in the high-level code caused that op to be used, unless you know the model very well and understand how the functions it calls translate to lower-level operations.

As you mentioned, we do have a Universal Sentence Encoder already converted and available as a premade model, if it is good enough for your needs: tfjs-models/universal-sentence-encoder at master · tensorflow/tfjs-models · GitHub
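For reference, browser usage of that premade package looks roughly like this (a sketch based on the package's public load/embed API; the example sentences are illustrative):

```javascript
// Sketch: embedding sentences in the browser with the premade lite USE.
// Requires: npm install @tensorflow/tfjs @tensorflow-models/universal-sentence-encoder
import * as use from '@tensorflow-models/universal-sentence-encoder';

async function embedSentences() {
  // load() fetches the pre-converted lite model weights over the network,
  // so first load can be slow but nothing needs to be installed by users
  const model = await use.load();

  // embed() returns a 2D tensor with one 512-dimensional row per sentence
  const embeddings = await model.embed([
    'How can I convert this model?',
    'What is the conversion procedure?',
  ]);
  embeddings.print();
}

embedSentences();
```

Since this runs entirely in the page, it matches the zero-install, no-server constraint you described, at the cost of the lite model's lower accuracy.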
