Model Maker predict_top_k gives different results to TFLite inference

Using Model Maker (MM) 0.4.2, I trained a classifier (efficientnet-lite) for an image classification problem, and predict_top_k gives good results.
I find there is no way to save the complete model so that TensorFlow can load it later.
Instead, I used export to save it as TFLite and also as a saved_model.
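Roughly what I am doing (a sketch; the model spec string, directory, and data loaders are placeholders):

```python
from tflite_model_maker import image_classifier
from tflite_model_maker.config import ExportFormat

# `data` and `test_data` are the DataLoaders I built (not shown here).
model = image_classifier.create(data, model_spec='efficientnet_lite0')
print(model.predict_top_k(test_data))  # good results here

# Export both formats; 'export/' is a placeholder directory.
model.export(export_dir='export/', export_format=ExportFormat.TFLITE)
model.export(export_dir='export/', export_format=ExportFormat.SAVED_MODEL)
```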
Running inference using the exact code from here gives completely different results. At least the TFLite and saved_model versions give the same numbers as each other. I am using TensorFlow 2.8.2.
So this is worthless for me. I have spent two days on this but nothing works.
If I were able to save the trained model completely, so that I could later load it and use predict_top_k, that would be fine, but Interpreter does not have this method.
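All I can do with the TFLite file is run the raw Interpreter, something like this (a sketch; the model path is a placeholder and the dummy input stands in for a real image):

```python
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path='export/model.tflite')
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input just to exercise the interpreter; a real image would need
# whatever resizing/normalization the model expects.
image = np.zeros(input_details[0]['shape'], dtype=input_details[0]['dtype'])
interpreter.set_tensor(input_details[0]['index'], image)
interpreter.invoke()

probs = interpreter.get_tensor(output_details[0]['index'])[0]
print(np.argsort(probs)[::-1][:3])  # indices of the top-3 classes, no labels
```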
I have changed to the nightly MM build (0.4.3.dev202210050510) and it still does not work.
Any ideas? It seems to be an incompatibility between MM and TF.

Hi Andrew, welcome to the TF Forum!

The model saved by MM is exactly the same model that you created and trained, and it is completely compatible with TF.

The predict_top_k function you are using is part of the wrapper that Model Maker puts around the model; it is not part of the model itself and is not exported with it. That is why the interpreter (on-device, I imagine?) can't find it.

The saved_model format (and the TFLite FlatBuffer) does not save all these extra functions that are added to the model, only the ones specified during the save.
It is like this to keep the saved_model format portable, so that it can be used everywhere. In this case, MM is not exporting the predict_top_k function.
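As an illustration (my own sketch, not Model Maker's actual export code): an extra method only survives tf.saved_model.save if it is a tf.function listed in signatures; plain Python methods on the wrapper are silently dropped.

```python
import tensorflow as tf

class TopKModule(tf.Module):
    """Wraps a classifier and exposes top-k prediction as a signature."""

    def __init__(self, keras_model, k=3):
        super().__init__()
        self.model = keras_model
        self.k = k

    @tf.function(input_signature=[tf.TensorSpec([None, 224, 224, 3], tf.float32)])
    def predict_top_k(self, images):
        probs = self.model(images)
        scores, indices = tf.math.top_k(probs, k=self.k)
        return {'scores': scores, 'indices': indices}

# Stand-in for your trained classifier.
keras_model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(224, 224, 3)),
    tf.keras.layers.Dense(5, activation='softmax'),
])

module = TopKModule(keras_model)
# Only the functions listed in `signatures` are kept in the export.
tf.saved_model.save(module, 'export/with_top_k',
                    signatures={'predict_top_k': module.predict_top_k})
```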

I’d try two things.

  • Given the code of the function here, I'd try to reproduce it in your app; see the sketch after this list
  • Create a GitHub issue on the MM repo requesting that they export this function, or at least add it to the TFLite Task Library, to make your life easier
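For the first option, a minimal re-creation could look like the sketch below (the label list, paths, and [0, 1] input scaling are my assumptions; check the linked source for the exact preprocessing Model Maker applies for your model spec):

```python
import numpy as np
import tensorflow as tf

LABELS = ['class_a', 'class_b', 'class_c']  # placeholder; use your labels file
model = tf.keras.models.load_model('export/saved_model')  # placeholder path

def predict_top_k(image_path, k=3, input_size=(224, 224)):
    """Preprocess one image like training did, run the model, and
    return the k highest-scoring (label, probability) pairs."""
    img = tf.io.decode_image(tf.io.read_file(image_path), channels=3,
                             expand_animations=False)
    img = tf.image.resize(img, input_size)
    img = tf.cast(img, tf.float32) / 255.0  # assumed scaling; verify for your spec
    probs = model(tf.expand_dims(img, 0))[0].numpy()
    top = np.argsort(probs)[::-1][:k]
    return [(LABELS[i], float(probs[i])) for i in top]

print(predict_top_k('some_image.jpg'))
```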

Hope it helps!