Stability of Training with Converted TensorFlow Lite Models on Android Devices

I’m exploring the process of converting TensorFlow models to TensorFlow Lite for on-device training on Android mobile phones. My conversion process looks like this:

import tensorflow as tf

# Convert the model
converter = tf.lite.TFLiteConverter.from_saved_model(SAVED_MODEL_DIR)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # enable TensorFlow Lite ops.
    tf.lite.OpsSet.SELECT_TF_OPS,  # enable TensorFlow ops.
]
converter.experimental_enable_resource_variables = True
tflite_model = converter.convert()
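For context, the saved model being converted for on-device training typically exposes explicit training and inference signatures, following the pattern from the TensorFlow Lite on-device training guide. Below is a minimal sketch of such a model; the signature names `train` and `infer`, the layer sizes, and the learning rate are illustrative assumptions, not part of the original post:

```python
import tempfile
import tensorflow as tf

# Hypothetical path; the original post uses SAVED_MODEL_DIR without defining it.
SAVED_MODEL_DIR = tempfile.mkdtemp()


class TrainableModel(tf.Module):
    """A tiny linear model with explicit train/infer signatures for TFLite."""

    def __init__(self):
        super().__init__()
        self.w = tf.Variable(tf.zeros([4, 2]))
        self.b = tf.Variable(tf.zeros([2]))

    @tf.function(input_signature=[
        tf.TensorSpec([None, 4], tf.float32),
        tf.TensorSpec([None, 2], tf.float32),
    ])
    def train(self, x, y):
        # One SGD step on a mean-squared-error loss (assumed lr = 0.1).
        with tf.GradientTape() as tape:
            pred = tf.matmul(x, self.w) + self.b
            loss = tf.reduce_mean(tf.square(pred - y))
        grads = tape.gradient(loss, [self.w, self.b])
        for var, grad in zip([self.w, self.b], grads):
            var.assign_sub(0.1 * grad)
        return {"loss": loss}

    @tf.function(input_signature=[tf.TensorSpec([None, 4], tf.float32)])
    def infer(self, x):
        return {"output": tf.matmul(x, self.w) + self.b}


model = TrainableModel()
tf.saved_model.save(
    model, SAVED_MODEL_DIR,
    signatures={"train": model.train, "infer": model.infer},
)
```

After this export, the converter snippet above (with `SELECT_TF_OPS` and resource variables enabled) picks up both signatures, and the Android interpreter can invoke them by name via signature runners.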

My primary concern is the stability and consistency of training results when using the converted TFLite model on an Android device. Specifically, I’m wondering whether training the same model with equivalent data on-device will yield significantly different results compared to its original TensorFlow counterpart. Training instability and notable variances in performance metrics are my main focus.

It’s understood that some discrepancies might arise due to the inherent differences between TensorFlow and TensorFlow Lite environments. However, I’m curious about the extent to which these discrepancies might manifest, especially in the context of on-device training.

Does anyone here have experience or insights regarding the stability and consistency of training results with converted TensorFlow Lite models on mobile devices? Any shared knowledge or tips on ensuring more reliable outcomes in such scenarios would be greatly appreciated.

Hi @Hsin-Hsuan_Sung

If you are not quantizing during conversion, the results from the .tflite model will match the inference results of the saved_model. Small differences in the 4th or 5th decimal place can occur, but I expect these will be insignificant for your case.

You can check this yourself by running inference with the TensorFlow Lite Interpreter API after conversion and comparing the results against the inference from your original model. It’s worth verifying the results this way before deploying your .tflite model.
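A minimal sketch of that verification, using a small stand-in Keras model (the layer sizes and tolerance are assumptions; substitute your own saved model and representative inputs):

```python
import numpy as np
import tensorflow as tf

# Stand-in for the original model; replace with your own.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(2),
])

# Convert without quantization, so outputs should match closely.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Run the same input through both the TF model and the TFLite interpreter.
x = np.random.rand(1, 4).astype(np.float32)
tf_out = model(x).numpy()

interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
input_detail = interpreter.get_input_details()[0]
output_detail = interpreter.get_output_details()[0]
interpreter.set_tensor(input_detail["index"], x)
interpreter.invoke()
tflite_out = interpreter.get_tensor(output_detail["index"])

# Differences should sit well below typical metric noise.
max_diff = np.max(np.abs(tf_out - tflite_out))
print(f"max abs difference: {max_diff:.2e}")
```

Running this over a batch of representative inputs (rather than a single random vector) gives a better picture of the worst-case deviation before deployment.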

If you have any more questions please come back.