Mobile Inference Models

Which models in the TensorFlow 2 Detection Zoo are suitable for TFLite conversion? Can you suggest a resource for training these models on my own data and converting them to TFLite?


@fatihatac,

Welcome to the TensorFlow Forum!

Currently, on-device inference is only optimized with SSD models.

Mobile-optimized detection models with a range of latency and precision characteristics can be found in the Detection Zoo. Most of the downloadable archives include a model.tflite file; if one is missing, a TensorFlow Lite flatbuffer can be generated by following these instructions. SSD models from the TF2 Object Detection Zoo can likewise be converted to TensorFlow Lite using the instructions here. Note that detection models cannot be passed directly to the TensorFlow Lite Converter: an intermediate step is required first to export a mobile-friendly source model (a SavedModel with TFLite-compatible ops), which is then converted.
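Once the intermediate mobile-friendly SavedModel has been exported (for SSD models from the Object Detection API this is done with the `export_tflite_graph_tf2.py` script), the final conversion step uses the standard `tf.lite.TFLiteConverter`. Here is a minimal sketch of that step; the tiny Keras model below is only a stand-in for the real exported SSD SavedModel, and the directory name `ssd_saved_model` is illustrative:

```python
import tensorflow as tf

# Stand-in for the mobile-friendly SavedModel that
# export_tflite_graph_tf2.py would produce for a real SSD model.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(320, 320, 3)),
    tf.keras.layers.Conv2D(4, 3),
])
tf.saved_model.save(model, "ssd_saved_model")

# Convert the SavedModel to a TensorFlow Lite flatbuffer.
converter = tf.lite.TFLiteConverter.from_saved_model("ssd_saved_model")
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # optional quantization
tflite_model = converter.convert()

# Write the flatbuffer to disk for on-device deployment.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

The `optimizations` line enables post-training quantization, which typically shrinks the model and speeds up on-device inference at a small cost in precision; it can be omitted for a plain float conversion.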

Thank you!