StringLookup (and other lookup layers) don't seem to be supported by TFLite conversion


So I have some categorical features that I run through StringLookup layers to transform the data before training on a GPU. The model will be served on CPU, so I can add those layers to the model after training. That works, and the new model runs fine with StringLookup integrated. But if I attempt to convert it to TFLite and run it, I get all manner of error messages, mostly about a convolutional layer, even though the model converted and ran fine before I added the StringLookup. Looking online, it seems that people can't use TFLite (or ONNX) with StringLookup, but then others say "this was fixed in this issue" and point to some GitHub issue that doesn't seem to solve it.
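To make the setup concrete, here is a minimal sketch of the pattern described above: wrapping a trained model with a StringLookup layer so it accepts raw strings, then attempting TFLite conversion. The vocabulary, layer sizes, and model are all illustrative stand-ins, not the poster's actual model.

```python
import tensorflow as tf

# Illustrative vocabulary and a stand-in for the trained model.
vocab = ["red", "green", "blue"]
lookup = tf.keras.layers.StringLookup(vocabulary=vocab)

inner = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=len(vocab) + 1, output_dim=4),
    tf.keras.layers.Dense(1),
])

# Prepend the lookup so the exported model accepts raw strings.
inputs = tf.keras.Input(shape=(1,), dtype=tf.string)
outputs = inner(lookup(inputs))
serving_model = tf.keras.Model(inputs, outputs)

# Calling the Keras model directly works fine with string inputs.
print(serving_model(tf.constant([["red"]])).shape)

# TFLite conversion of models containing lookup ops has historically
# failed or produced errors, which is the problem described above.
converter = tf.lite.TFLiteConverter.from_keras_model(serving_model)
try:
    converter.convert()
except Exception as e:
    print("conversion failed:", type(e).__name__)
```

The key point is that the same `inner` model converts fine on its own; it is the string lookup in front of it that trips up the converter.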

I just want to make sure that it actually isn't supported as of TF 2.11, because I would really, really appreciate it if it worked.

Edit: PS I assume this also means I can't add tokenizers to my models if I want things to run faster in slimmer Docker images with less code overhead and fewer metadata objects to manage.

According to the documentation, string lookup is not enabled on TFLite yet: TensorFlow Lite and TensorFlow operator compatibility

I think the Task Library doesn't use the StringLookup layer and has its own tokenizer. Maybe that would be a solution for you too?

I’m not an expert but maybe @ptruiz can add some insights here too

It's okay. I made a big fat model with only this kind of preprocessing and then I map it over a dataset. I figure this part probably wouldn't be much faster if converted anyway. I would have liked to get to a point where my deployed images don't need the full TensorFlow libraries, but not today (read: the DevOps guys would have liked that).
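The workaround described here, a model containing only the lookup preprocessing, applied to a `tf.data` pipeline with `map()` instead of being converted, might look roughly like this (vocabulary and data are made up for illustration):

```python
import tensorflow as tf

# A "model" that does nothing but the string-to-index preprocessing.
vocab = ["cat", "dog", "bird"]
preprocess = tf.keras.Sequential([
    tf.keras.layers.StringLookup(vocabulary=vocab),
])

# Map the preprocessing model over a dataset of raw strings.
# With the default settings, index 0 is reserved for out-of-vocabulary
# tokens, so "emu" maps to 0.
ds = tf.data.Dataset.from_tensor_slices([["dog"], ["cat"], ["emu"]])
ds = ds.map(preprocess)

print(list(ds.as_numpy_iterator()))
```

The downside, as noted above, is that this keeps the full TensorFlow runtime in the deployment image rather than a slim TFLite one.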

But… Does this mean that if I use the Keras lookup layers for data pre/post-processing, then I can't build a serving image without a full 4-5 GB TensorFlow install in it?

TFLite's needs are different from TF Serving's.

The lookup ops are not available on TFLite.
For serving it's different: the official core ops should all work fine if they are in the saved_model.
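In other words, exporting a model containing StringLookup to the SavedModel format should work, since the underlying lookup-table ops are core TF ops; it is only the TFLite converter that lacks them. A minimal sketch of such an export (vocabulary and paths are illustrative):

```python
import os
import tempfile

import tensorflow as tf

# A tiny model whose only job is the string lookup; core TF ops, so the
# SavedModel round-trips even though TFLite conversion would not.
vocab = ["a", "b"]
inputs = tf.keras.Input(shape=(1,), dtype=tf.string)
outputs = tf.keras.layers.StringLookup(vocabulary=vocab)(inputs)
model = tf.keras.Model(inputs, outputs)

# Export in the SavedModel format TF Serving loads (version dir "1"
# follows the usual TF Serving layout; the path here is throwaway).
export_dir = os.path.join(tempfile.mkdtemp(), "model", "1")
model.save(export_dir)

# Reload and query with a raw string; "b" maps to index 2 because
# index 0 is reserved for out-of-vocabulary tokens by default.
reloaded = tf.keras.models.load_model(export_dir)
print(reloaded(tf.constant([["b"]])))
```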

Ah ok. TF Serving is far from flexible enough for my needs. Thank you for the reply though.