Hello, I would like to ask a question about the On-Device Training case. When I perform transfer learning, I get the following error, which suggests that these operators are not supported. Does the On-Device Training case currently not support convolution? How should I solve this problem? Looking forward to your reply!
“TFLite interpreter needs to link Flex delegate in order to run the model since it contains the following Select TFop(s): Flex ops: FlexBroadcastGradientArgs, FlexConv2DBackpropFilter, FlexFusedBatchNormGradV3, FlexFusedBatchNormV3, FlexReluGrad”
The TensorFlow Lite builtin operator library supports only a limited subset of TensorFlow operators, so not every model is convertible. For details, refer to the operator compatibility guide.
To allow conversion, you can enable the use of select TensorFlow ops in your TensorFlow Lite model.
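Concretely, this is done by adding `SELECT_TF_OPS` to the converter's supported op sets, so that ops without a TFLite builtin kernel (such as the Flex ops listed in the error) fall back to the TensorFlow implementation. A minimal sketch, using a small Conv2D model as a stand-in for your own:

```python
import tensorflow as tf

# A tiny model with a Conv2D layer, standing in for the actual model.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8, 8, 1)),
    tf.keras.layers.Conv2D(4, 3),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Allow ops without a TFLite builtin equivalent to be carried as
# select TensorFlow (Flex) ops in the converted model.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # prefer builtin TFLite ops
    tf.lite.OpsSet.SELECT_TF_OPS,    # fall back to select TF ops (Flex)
]
tflite_model = converter.convert()
```

Note that a model converted this way also needs the Flex delegate at runtime; on Android that means adding the select-TF-ops runtime dependency (`org.tensorflow:tensorflow-lite-select-tf-ops`) alongside the standard TFLite library.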
To deploy a TensorFlow Lite model with on-device training built in, the high-level steps are:
1. Build a TensorFlow model for training and inference.
2. Convert the TensorFlow model to TensorFlow Lite format.
3. Integrate the model into your Android app.
4. Invoke model training in the app, similar to how you would invoke model inference.
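The first two steps above can be sketched as follows: a `tf.Module` exposing separate `train` and `infer` signatures, saved and then converted with resource variables and select TF ops enabled. The model shape, learning rate, and signature names here are illustrative, not prescribed:

```python
import tensorflow as tf

class TrainableModel(tf.Module):
    """A deliberately tiny model; a real transfer-learning head
    would sit on top of frozen base-model features."""

    def __init__(self):
        self.w = tf.Variable(tf.zeros([4, 2]))
        self.b = tf.Variable(tf.zeros([2]))

    @tf.function(input_signature=[
        tf.TensorSpec([None, 4], tf.float32),
        tf.TensorSpec([None, 2], tf.float32),
    ])
    def train(self, x, y):
        # One SGD step; this graph is what the app calls to train.
        with tf.GradientTape() as tape:
            logits = tf.matmul(x, self.w) + self.b
            loss = tf.reduce_mean(
                tf.nn.softmax_cross_entropy_with_logits(
                    labels=y, logits=logits))
        grads = tape.gradient(loss, [self.w, self.b])
        for var, g in zip([self.w, self.b], grads):
            var.assign_sub(0.1 * g)  # illustrative learning rate
        return {"loss": loss}

    @tf.function(input_signature=[tf.TensorSpec([None, 4], tf.float32)])
    def infer(self, x):
        return {"output": tf.nn.softmax(tf.matmul(x, self.w) + self.b)}

model = TrainableModel()
tf.saved_model.save(
    model, "/tmp/trainable_model",
    signatures={"train": model.train, "infer": model.infer})

converter = tf.lite.TFLiteConverter.from_saved_model("/tmp/trainable_model")
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,  # training ops often need Flex fallback
]
# Required so trainable variables survive conversion.
converter.experimental_enable_resource_variables = True
tflite_model = converter.convert()
```

In the app, the `train` and `infer` signatures are then invoked by name through the TFLite `Interpreter` signature-runner API.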