Error implementing my custom tflite model in Flutter

Hi everyone, I have a custom tflite model I am trying to run in Flutter using tflite_flutter, and it gives the following error:
Select TensorFlow op(s), included in the given model, is(are) not supported by this interpreter. Make sure you apply/link the Flex delegate before inference. For the Android, it can be resolved by adding “org.tensorflow:tensorflow-lite-select-tf-ops” dependency. See instructions: Select TensorFlow operators | TensorFlow Lite
E/tflite (12553): Node number 6 (FlexTensorListReserve) failed to prepare.

Has anyone encountered this issue, especially with Flutter, and how can it be resolved?

Thank you.

Hey, I have the same issue. Have you resolved it yet?

The error you’re encountering, “Select TensorFlow op(s), included in the given model, is(are) not supported by this interpreter,” means that your custom TensorFlow Lite (TFLite) model uses operations that are not part of the standard TFLite builtin op set. Specifically, FlexTensorListReserve in your error message is the TensorFlow TensorListReserve op wrapped as a Flex op; it is not included in the default TFLite runtime and needs Select TF ops (Flex delegate) support to run.
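
For context, Flex ops like this usually end up in a .tflite file because the model was converted with Select TF ops enabled. A minimal sketch of such a conversion in Python (the SavedModel path and output filename are placeholders):

import tensorflow as tf

# Illustrative conversion; "my_saved_model" is a placeholder for your model's path.
converter = tf.lite.TFLiteConverter.from_saved_model("my_saved_model")
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # ops TFLite supports natively
    tf.lite.OpsSet.SELECT_TF_OPS,    # fall back to TensorFlow (Flex) ops
]
with open("model.tflite", "wb") as f:
    f.write(converter.convert())

Any op that falls into the second bucket shows up at runtime with the Flex prefix and needs the Flex delegate on the device.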

Here’s how you can address this issue:

For Android:

  1. Add TensorFlow Lite Select TensorFlow Ops Dependency: As the error message suggests, add the Select TensorFlow ops library to your Android module's build.gradle (android/app/build.gradle in a Flutter project):

dependencies {
    // Other dependencies
    implementation 'org.tensorflow:tensorflow-lite:0.0.0-nightly' // Use the latest version
    implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:0.0.0-nightly' // Use the latest version
}

Replace 0.0.0-nightly with the release version you’re targeting (keep both artifacts on the same version), or leave it pointing at the nightly build if you need the most recent op coverage.
2. Flex Delegate: With the select-tf-ops dependency in place, the Flex delegate is bundled into the app and applied automatically when the interpreter loads a model that needs it, so you normally don’t have to change your Dart code; just do a full rebuild (e.g. flutter clean and then flutter run) so the native library is actually packaged. If it still fails, you may need some Android-specific code (for example via MethodChannels), since tflite_flutter might not expose the Flex delegate directly from Dart.
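
A quick way to separate model problems from app-integration problems is to confirm the model runs at all once Flex ops are available. Here is a sketch using the Python interpreter from the full tensorflow package (which bundles the Flex delegate); the model path is a placeholder and your input shapes will differ:

import numpy as np
import tensorflow as tf

# Load the converted model; the full tensorflow package links the Flex delegate,
# so Flex ops such as FlexTensorListReserve can run here.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy input just to exercise the graph end to end.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()
print(interpreter.get_tensor(output_details[0]["index"]).shape)

If this works on the desktop but the same model fails in the app, the problem is almost certainly the missing select-tf-ops library rather than the model itself.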

For iOS:

For iOS, Select TensorFlow ops support is published as a separate CocoaPods pod (TensorFlowLiteSelectTfOps), but it adds considerable binary size and typically needs a -force_load entry in the linker flags before the ops are registered, so it is more involved than the Android setup. If pulling it in is not practical, alternatives include:

  • Custom Build: Creating a custom build of the TensorFlow Lite library that includes the necessary TensorFlow ops. This is quite complex and requires a deep understanding of TensorFlow Lite’s build process.
  • Model Modification: Modifying the model to avoid using TensorFlow ops that are not supported by TFLite. This might involve changing the model architecture or replacing unsupported operations with ones that are supported by TFLite.
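
On the model-modification side, FlexTensorListReserve often comes from dynamic loops (for example recurrent layers or tf.map_fn). One commonly suggested workaround, sketched below for a hypothetical Keras LSTM with a fixed sequence length (your architecture and dimensions will differ), is to unroll the recurrence so the converter may avoid the TensorList ops; converting with builtins only then acts as a check that nothing TF-only remains:

import tensorflow as tf

# Hypothetical fixed-length LSTM model; SEQ_LEN, FEATURES and layer sizes are placeholders.
SEQ_LEN, FEATURES = 20, 8
model = tf.keras.Sequential([
    tf.keras.Input(shape=(SEQ_LEN, FEATURES)),
    # unroll=True trades graph size for a static graph without dynamic TensorList ops
    tf.keras.layers.LSTM(32, unroll=True),
    tf.keras.layers.Dense(4, activation="softmax"),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Builtins only: conversion fails with a clear error if any TF-only op remains.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS]
with open("model_builtins_only.tflite", "wb") as f:
    f.write(converter.convert())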

General Recommendations:

  • Simplify Model: If possible, modify your TensorFlow model to avoid using complex operations that require the Flex delegate. This can make the model more compatible with TFLite and improve performance on mobile devices.
  • Check TFLite Support: Ensure that all operations used in your model are supported by TFLite; TensorFlow publishes the list of builtin TFLite ops, and you can inspect which ops your converted model actually uses (see the sketch after this list). Consider replacing unsupported ops with supported alternatives.
  • Update Libraries: Make sure you’re using the latest versions of TensorFlow, TFLite, and the tflite_flutter package, as support for operations and features is continually improving.
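
For the “Check TFLite Support” point, recent TensorFlow releases include an experimental model analyzer that prints every operator in a converted .tflite file, which makes it easy to see exactly which nodes are Flex ops. A sketch (the model path is a placeholder, and since the API lives under tf.lite.experimental it may change between versions):

import tensorflow as tf

# Prints a per-operator breakdown of the converted model, including any
# Flex (select TF) ops that would still require the Flex delegate on device.
tf.lite.experimental.Analyzer.analyze(model_path="model.tflite")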

Addressing this issue might require some trial and error, especially if you need to modify your model or implement custom platform-specific solutions for Flex delegate support in Flutter.