Issue with Flex ops: FlexCombinedNonMaxSuppression, FlexResizeBilinear

Hello Everyone,

I am currently working on converting a TensorFlow model to a TFLite model. I am facing the following issue while converting the model and I couldn't find a solution for it. Has anyone faced the same issue and found a solution?

2024-03-12 11:20:22.181279: W tensorflow/compiler/mlir/lite/python/] Ignored output_format.
2024-03-12 11:20:22.181335: W tensorflow/compiler/mlir/lite/python/] Ignored drop_control_dependency.
2024-03-12 11:20:22.182328: I tensorflow/cc/saved_model/] Reading SavedModel from: ./export/1/
2024-03-12 11:20:22.331948: I tensorflow/cc/saved_model/] Reading meta graph with tags { serve }
2024-03-12 11:20:22.332007: I tensorflow/cc/saved_model/] Reading SavedModel debug info (if present) from: ./export/1/
2024-03-12 11:20:22.888341: I tensorflow/cc/saved_model/] Restoring SavedModel bundle.
2024-03-12 11:20:24.772238: I tensorflow/cc/saved_model/] Running initialization op on SavedModel bundle at path: ./export/1/
2024-03-12 11:20:25.905722: I tensorflow/cc/saved_model/] SavedModel load for tags { serve }; Status: success: OK. Took 3723399 microseconds.
2024-03-12 11:20:28.243562: I tensorflow/compiler/mlir/tensorflow/utils/] disabling MLIR crash reproducer, set env var MLIR_CRASH_REPRODUCER_DIRECTORY to enable.
2024-03-12 11:20:32.225859: W tensorflow/compiler/mlir/lite/] TFLite interpreter needs to link Flex delegate in order to run the model since it contains the following Select TFop(s):
Flex ops: FlexCombinedNonMaxSuppression, FlexResizeBilinear
tf.CombinedNonMaxSuppression(tensor<?x49104x1x4xf32>, tensor<?x?x1xf32>, tensor, tensor, tensor, tensor) → (tensor<?x300x4xf32>, tensor<?x300xf32>, tensor<?x300xf32>, tensor<?xi32>) : {clip_boxes = false, device = "", pad_per_class = false}
tf.ResizeBilinear(tensor<?x?x?x3xui8>, tensor<2xi32>) → (tensor<?x?x?x3xf32>) : {align_corners = false, device = "", half_pixel_centers = true}
See instructions: Select TensorFlow operators (TensorFlow Lite guide)
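For context, this warning is not a conversion failure: the converter emits the model, but it contains ops (CombinedNonMaxSuppression, ResizeBilinear) that have no TFLite builtin kernel, so the interpreter must link the Flex delegate to run them. A minimal conversion sketch that opts in to Select TF ops is below, assuming TF 2.x and the SavedModel directory `./export/1/` from the log above; the function name `convert_with_flex` is my own:

```python
import tensorflow as tf


def convert_with_flex(saved_model_dir: str) -> bytes:
    """Convert a SavedModel to TFLite, allowing Select TF (Flex) ops."""
    converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
    converter.target_spec.supported_ops = [
        tf.lite.OpsSet.TFLITE_BUILTINS,  # use builtin TFLite kernels where possible
        tf.lite.OpsSet.SELECT_TF_OPS,    # fall back to TF kernels (Flex) for the rest
    ]
    return converter.convert()


# tflite_model = convert_with_flex("./export/1/")
# open("model.tflite", "wb").write(tflite_model)
```

At runtime the app then has to link the Flex delegate library (e.g. the `tensorflow-lite-select-tf-ops` AAR on Android, or a TF-built interpreter in Python), otherwise loading the model fails with the same "needs to link Flex delegate" message.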

Thank you very much in advance.