TFLite Python Installation with Flex Delegate

Hi everyone,

I am currently working on deploying and fine-tuning a TFLite model. My goal is to run inference (which already works) as well as to continue training on-device using only TFLite. To achieve this, I have followed this tutorial.
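For context, the part of my script that hits the error looks roughly like the sketch below (the 'train' / 'infer' / 'save' / 'restore' signature names come from the tutorial; the model path, input names and shapes are placeholders):

import numpy as np
from tflite_runtime.interpreter import Interpreter  # wheel built natively with CMake

# Placeholder path; the model was converted following the on-device training tutorial,
# so it exposes 'train' / 'infer' / 'save' / 'restore' signatures.
interpreter = Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

# Training runs through the 'train' signature; input names/shapes here are placeholders.
train = interpreter.get_signature_runner("train")
result = train(x=np.zeros((1, 28, 28), dtype=np.float32),
               y=np.zeros((1, 10), dtype=np.float32))

# The 'restore' signature (reloading saved weights) is where the Restore op lives,
# which is the FlexRestore node mentioned in the error below.
restore = interpreter.get_signature_runner("restore")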

I have built the TFLite wheel natively on my edge device with CMake, following the instructions here: Build TensorFlow Lite Python Wheel Package  |  TensorFlow Lite

However, when running my script, I encounter the following problem:

Select TensorFlow op(s), included in the given model, is(are) not supported by this interpreter. Make sure you apply/link the Flex delegate before inference. For the Android, it can be resolved by adding "org.tensorflow:tensorflow-lite-select-tf-ops" dependency. See instructions: https://www.tensorflow.org/lite/guide/ops_select
Node number 32 (FlexRestore) failed to prepare.

After some research, I found out that the Flex delegate (Select TensorFlow ops) needs to be included to continue training with TFLite only. Is it possible to include the Flex delegate when building the TFLite Python wheel with CMake?

I am asking because I found little to no guidance online on how to enable the Flex delegate in Python. The only guide I found is this one, but it only mentions an experimental method based on Bazel, and I would strongly prefer to use CMake. Is this possible? (I am using Python 3.10.10 and I cannot install the full TF package.)

Thank you very much in advance.

@Matteo,

Welcome to the TensorFlow Forum,

For Python, TensorFlow Lite with select TensorFlow ops is installed automatically with the TensorFlow pip package. You can also choose to install only the TensorFlow Lite interpreter pip package, tflite_runtime, which comes with a much smaller installation size.
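As a minimal sketch (the model path is a placeholder): with the full TensorFlow pip package installed, the Flex delegate is already bundled, so a model containing select TF ops loads without any extra linking step.

import tensorflow as tf

# The full TF pip package bundles the Flex delegate, so select TF ops in the model
# are resolved automatically by tf.lite.Interpreter.
interpreter = tf.lite.Interpreter(model_path="model_with_select_ops.tflite")  # placeholder
interpreter.allocate_tensors()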

Thank you!

Hey there, thanks for your answer. I cannot install the full TensorFlow pip package because it is too large for my edge device. Is it possible to install TFLite with the select TF ops only?

@Matteo,

As mentioned earlier, you can choose to only install the TensorFlow Lite Interpreter pip package:

!pip install tflite-runtime
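After installing it, the interpreter is imported from tflite_runtime instead of tensorflow. A minimal inference sketch (the model path is a placeholder):

from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path="model.tflite")  # placeholder path
interpreter.allocate_tensors()

# Same Python inference API as tf.lite.Interpreter, in a much smaller package.
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()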

Thank you!

Hey there, I understand that, but that package does not include the “select TensorFlow ops”, right?

@Matteo,

Yes, the tflite-runtime package does not include the select ops. Unfortunately, you may have to use the entire TensorFlow package to use select ops. The select ops feature is intended for use cases where memory constraints are not strict.

If your use case does have memory constraints, make sure your model uses only TFLite builtin ops so it can run on TFLite alone. If an op that is important for your use case is missing from the TFLite builtin ops, please create a feature request in the TensorFlow GitHub repo.
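For illustration, restricting the converter to builtin ops makes the conversion fail early if the model still needs TF ops, rather than failing later on the device (the saved model path is a placeholder):

import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")  # placeholder
# Builtin ops only: conversion fails with ERROR_NEEDS_FLEX_OPS if the graph
# contains ops that have no TFLite builtin equivalent.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS]
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)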

Thank you!

I am using these statements:

converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
]

I am getting the following error. How do I deal with it?

Error code: ERROR_NEEDS_FLEX_OPS
:0: error: failed while converting: 'main':
Some ops are not supported by the native TFLite runtime, you can enable TF kernels fallback using TF Select. See instructions: Select TensorFlow operators  |  TensorFlow Lite
TF Select ops: CombinedNonMaxSuppression, ResizeBilinear
Details:
tf.CombinedNonMaxSuppression(tensor<?x49104x1x4xf32>, tensor<?x?x1xf32>, tensor, tensor, tensor, tensor) -> (tensor<?x300x4xf32>, tensor<?x300xf32>, tensor<?x300xf32>, tensor<?xi32>) : {_cloned = true, clip_boxes = false, device = "", pad_per_class = false}
tf.ResizeBilinear(tensor<?x?x?x3xui8>, tensor<2xi32>) -> (tensor<?x?x?x3xf32>) : {align_corners = false, device = "", half_pixel_centers = true}
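From what I understand, the error message wants me to allow TF Select ops as a fallback in the converter, roughly as sketched below, but then the converted model needs the Flex delegate at runtime, which brings me back to the problem discussed above:

import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")  # placeholder
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # native TFLite builtin ops
    tf.lite.OpsSet.SELECT_TF_OPS,    # TF kernel fallback for ops like CombinedNonMaxSuppression
]
tflite_model = converter.convert()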

@Dibya_Jyoti I am facing the same issue. Were you able to fix it?