Hello TF experts!
I’m trying to run a TF-Lite model (ARMv7, Linux, C++) converted from an ONNX/TF model.
I have already done this successfully for a simple DNN, but now that I’m trying an LSTM I get errors at runtime.
I have been able to convert the model to TF-Lite by following these instructions: Select TensorFlow operators | TensorFlow Lite
I have also updated my TF-Lite library build to include the contents of the “tensorflow/lite/delegates/flex” folder, but I still get this error at runtime:
“ERROR: Regular TensorFlow ops are not supported by this interpreter. Make sure you apply/link the Flex delegate before inference.
ERROR: Node number 1 (FlexVarHandleOp) failed to prepare.”
I’m not building TF-Lite with Bazel (I have a specific build environment), so maybe there are some additional steps I’m missing?
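In case it helps, here is roughly how I understand the delegate would be applied explicitly from C++ (a minimal sketch; I’m assuming `tflite::FlexDelegate::Create()` from `tensorflow/lite/delegates/flex/delegate.h` is the right entry point, and "model.tflite" is just a placeholder path). Please correct me if this is not the intended way:

```cpp
#include <memory>

#include "tensorflow/lite/delegates/flex/delegate.h"
#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main() {
  // Load the converted model (placeholder path).
  auto model = tflite::FlatBufferModel::BuildFromFile("model.tflite");
  if (!model) return 1;

  // Build the interpreter with the builtin op resolver.
  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(*model, resolver)(&interpreter);
  if (!interpreter) return 1;

  // Explicitly create and apply the Flex delegate so ops like
  // FlexVarHandleOp are handled by TensorFlow kernels instead of
  // failing to prepare. The delegate must outlive the interpreter.
  auto flex_delegate = tflite::FlexDelegate::Create();
  if (interpreter->ModifyGraphWithDelegate(flex_delegate.get()) != kTfLiteOk) {
    return 1;
  }

  if (interpreter->AllocateTensors() != kTfLiteOk) return 1;

  // ... fill input tensors, interpreter->Invoke(), read outputs ...
  return 0;
}
```

I also suspect that, outside of Bazel, the flex library might get dropped by the linker unless it is force-linked (e.g. with something like `-Wl,--no-as-needed` or a whole-archive flag), but I’m not sure whether that is required here.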
Many thanks for any advice on my issue,