I have a TensorFlow Lite model that requires some TensorFlow Text ops (CaseFoldUTF8 and RegexSplitWithOffsets).
While using Python's TFLite interpreter, I can use
import tensorflow_text as tf_text
to load the necessary TensorFlow Text op binaries for inference.
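For reference, this is roughly what my working Python setup looks like (a minimal sketch; the model file name is a placeholder for my own model, which was converted with SELECT_TF_OPS enabled):

```python
import tensorflow as tf
import tensorflow_text as tf_text  # noqa: F401 -- importing this loads the TF Text custom-op
                                   # libraries (CaseFoldUTF8, RegexSplitWithOffsets, ...)

# "model_with_tf_text.tflite" is a placeholder for my converted model.
interpreter = tf.lite.Interpreter(model_path="model_with_tf_text.tflite")
interpreter.allocate_tensors()  # works here because the TF Text kernels are registered
                                # in the TensorFlow runtime that the Flex delegate uses
```

As I understand it, importing tensorflow_text is what registers those kernels with the TensorFlow runtime, and the Flex (Select TF ops) delegate then finds them at inference time.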
But when deploying on Android, the monolithic AAR libraries from Maven Central (both the core TFLite AAR and the Select TF ops AAR) do not include the TensorFlow Text ops, which gives me errors such as:
Op type not registered 'CaseFoldUTF8' in binary running on localhost. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
Looking at the TFLite guide Supported Select TensorFlow operators | TensorFlow Lite, it states that 'On the runtime side, it is also required to link the TensorFlow Text or SentencePiece library into the final app or binary.'
The question is: how do I build AARs that include the TensorFlow Text ops?
I have tried using tensorflow/lite/tools/build_aar.sh to build the AAR for my model, and it fails with a similar error ('Op CaseFoldUTF8 not found').
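For reference, this is roughly the command I ran, following the Select TF ops guide (a sketch; the model file name and target architectures are placeholders for my own):

```sh
# Run from the root of a tensorflow repo checkout.
# "model_with_tf_text.tflite" is a placeholder for my converted model.
bash tensorflow/lite/tools/build_aar.sh \
  --input_models=model_with_tf_text.tflite \
  --target_archs=arm64-v8a,armeabi-v7a
```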
Also, while I haven't tried the iOS binaries yet, do I need to specially build iOS frameworks that include the TensorFlow Text ops too? If so, how do I do that?