Flex delegate support within application for ARM target is not working

I’ve been trying to get Flex delegate support for TFLite models by following the instructions here. I was able to build the tensorflowlite and tensorflowlite_flex libraries fine with Bazel (essentially using the meta-tensorflow recipe). I’m then trying to use those libraries under a CMake build system to build the inference app.

Sample lines from the CMake file:

add_executable(inference inference.cpp)
target_link_libraries(inference tensorflowlite tensorflowlite_flex)

The inference app compiles fine, but when I try to run it on the target edge device it fails with an error about missing TF op support, which is basically the exact error I shouldn’t be seeing with the Flex library in place.

INFO: Created TensorFlow Lite XNNPACK delegate for CPU.
ERROR: Select TensorFlow op(s), included in the given model, is(are) not supported by this interpreter. Make sure you apply/link the Flex delegate before inference. For the Android, it can be resolved by adding "org.tensorflow:tensorflow-lite-select-tf-ops" dependency. See instructions: https://www.tensorflow.org/lite/guide/ops_select
ERROR: Node number 1 (FlexConv2D) failed to prepare.
ERROR: Select TensorFlow op(s), included in the given model, is(are) not supported by this interpreter. Make sure you apply/link the Flex delegate before inference. For the Android, it can be resolved by adding "org.tensorflow:tensorflow-lite-select-tf-ops" dependency. See instructions: https://www.tensorflow.org/lite/guide/ops_select
ERROR: Node number 1 (FlexConv2D) failed to prepare.

Then I thought maybe linking both libraries was the problem and that it was preventing the right library (in this case tensorflowlite_flex) from being picked, so I tried removing the tensorflowlite library from the link. The CMake file then looks like this:

add_executable(inference inference.cpp)
target_link_libraries(inference tensorflowlite_flex)

which results in the linking errors below:

[1/1] Linking CXX executable ~/inference
FAILED: ~/inference 
: && /usr/bin/aarch64-linux/aarch64-linux-g++   -mcpu=cortex-a53 -march=armv8-a+crc+crypto -fstack-protector-strong  -O2 -D_FORTIFY_SOURCE=2 -Wformat -Wformat-security -Werror=format-security --sysroot=/cortexa53-crypto-linux --sysroot=/cortexa53-crypto-linux -O2 -pipe -g -feliminate-unused-debug-types  -DFUZION_OS -fvisibility-inlines-hidden -Wdate-time -g -Wl,-O1 -Wl,--hash-style=gnu -Wl,--as-needed  -Wl,-z,relro,-z,now ~/CMakeFiles/inference.dir/inference.cpp.o -o ~/inference  -ltensorflowlite_flex && :
/usr/libexec/gcc/aarch64-linux/11.3.0/real-ld: ~/CMakeFiles/inference.dir/inference.cpp.o: in function `tflite::MutableOpResolver::~MutableOpResolver()':
/cortexa53-crypto-linux/usr/include/tensorflow/lite/mutable_op_resolver.h:63: undefined reference to `vtable for tflite::MutableOpResolver'
/usr/libexec/gcc/aarch64-linux/11.3.0/real-ld: /cortexa53-crypto-linux/usr/include/tensorflow/lite/mutable_op_resolver.h:63: undefined reference to `vtable for tflite::MutableOpResolver'
/usr/libexec/gcc/aarch64-linux/11.3.0/real-ld: ~/CMakeFiles/inference.dir/inference.cpp.o: in function `main':
~/inference.cpp:65: undefined reference to `tflite::DefaultErrorReporter()'
/usr/libexec/gcc/aarch64-linux/11.3.0/real-ld: ~/inference.cpp:65: undefined reference to `tflite::FlatBufferModel::BuildFromFile(char const*, tflite::ErrorReporter*)'
/usr/libexec/gcc/aarch64-linux/11.3.0/real-ld: ~/inference.cpp:69: undefined reference to `tflite::ops::builtin::BuiltinOpResolver::BuiltinOpResolver()'
/usr/libexec/gcc/aarch64-linux/11.3.0/real-ld: ~/inference.cpp:71: undefined reference to `tflite::InterpreterBuilder::InterpreterBuilder(tflite::FlatBufferModel const&, tflite::OpResolver const&, tflite::InterpreterOptions const*)'
/usr/libexec/gcc/aarch64-linux/11.3.0/real-ld: ~/inference.cpp:71: undefined reference to `tflite::InterpreterBuilder::operator()(std::unique_ptr<tflite::Interpreter, std::default_delete<tflite::Interpreter> >*)'
/usr/libexec/gcc/aarch64-linux/11.3.0/real-ld: ~/inference.cpp:71: undefined reference to `tflite::InterpreterBuilder::~InterpreterBuilder()'
/usr/libexec/gcc/aarch64-linux/11.3.0/real-ld: ~/inference.cpp:79: undefined reference to `tflite::Interpreter::AllocateTensors()'
/usr/libexec/gcc/aarch64-linux/11.3.0/real-ld: ~/inference.cpp:119: undefined reference to `tflite::Interpreter::Invoke()'
/usr/libexec/gcc/aarch64-linux/11.3.0/real-ld: ~/CMakeFiles/inference.dir/inference.cpp.o: in function `std::default_delete<tflite::Interpreter>::operator()(tflite::Interpreter*) const':
/cortexa53-crypto-linux/usr/include/c++/11.3.0/bits/unique_ptr.h:85: undefined reference to `tflite::Interpreter::~Interpreter()'
/usr/libexec/gcc/aarch64-linux/11.3.0/real-ld: ~/CMakeFiles/inference.dir/inference.cpp.o: in function `std::default_delete<tflite::FlatBufferModel>::operator()(tflite::FlatBufferModel*) const':
/cortexa53-crypto-linux/usr/include/c++/11.3.0/bits/unique_ptr.h:85: undefined reference to `tflite::FlatBufferModel::~FlatBufferModel()'
/usr/libexec/gcc/aarch64-linux/11.3.0/real-ld: ~/CMakeFiles/inference.dir/inference.cpp.o: in function `main':
~/inference.cpp:71: undefined reference to `tflite::InterpreterBuilder::~InterpreterBuilder()'
/usr/libexec/gcc/aarch64-linux/11.3.0/real-ld: ~/CMakeFiles/inference.dir/inference.cpp.o:(.data.rel.ro._ZTIN6tflite3ops7builtin17BuiltinOpResolverE[_ZTIN6tflite3ops7builtin17BuiltinOpResolverE]+0x10): undefined reference to `typeinfo for tflite::MutableOpResolver'
/usr/libexec/gcc/aarch64-linux/11.3.0/real-ld: ~/CMakeFiles/inference.dir/inference.cpp.o:(.data.rel.ro._ZTVN6tflite3ops7builtin17BuiltinOpResolverE[_ZTVN6tflite3ops7builtin17BuiltinOpResolverE]+0x10): undefined reference to `tflite::MutableOpResolver::FindOp(tflite::BuiltinOperator, int) const'
/usr/libexec/gcc/aarch64-linux/11.3.0/real-ld: ~/CMakeFiles/inference.dir/inference.cpp.o:(.data.rel.ro._ZTVN6tflite3ops7builtin17BuiltinOpResolverE[_ZTVN6tflite3ops7builtin17BuiltinOpResolverE]+0x18): undefined reference to `tflite::MutableOpResolver::FindOp(char const*, int) const'
/usr/libexec/gcc/aarch64-linux/11.3.0/real-ld: ~/CMakeFiles/inference.dir/inference.cpp.o:(.data.rel.ro._ZTVN6tflite3ops7builtin17BuiltinOpResolverE[_ZTVN6tflite3ops7builtin17BuiltinOpResolverE]+0x48): undefined reference to `tflite::MutableOpResolver::MayContainUserDefinedOps() const'
collect2: error: ld returned 1 exit status
ninja: build stopped: subcommand failed.

In the inference.cpp file mentioned above, line 65 is as below:

auto model = tflite::FlatBufferModel::BuildFromFile(model_file_path);
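For context, the relevant part of main() is essentially the standard TFLite flow shown below (a simplified sketch; error handling and the actual input/output handling are omitted, and the comments map the calls to the line numbers that show up in the linker output):

#include <memory>

#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main(int argc, char* argv[]) {
  const char* model_file_path = argv[1];

  // ~line 65: load the .tflite flatbuffer from disk.
  auto model = tflite::FlatBufferModel::BuildFromFile(model_file_path);

  // ~line 69: resolver for the builtin ops; Flex (TF) ops are expected to be
  // picked up automatically once tensorflowlite_flex is linked into the binary.
  tflite::ops::builtin::BuiltinOpResolver resolver;

  // ~line 71: build the interpreter.
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(*model, resolver)(&interpreter);

  // ~line 79: allocate tensors; this is where the "FlexConv2D failed to
  // prepare" error is reported when the Flex delegate is missing.
  interpreter->AllocateTensors();

  // ... fill input tensors ...

  // ~line 119: run inference.
  interpreter->Invoke();
  return 0;
}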

The other way around, if I remove tensorflowlite_flex from the link instead, there is no impact and the binary still compiles fine.

add_executable(inference inference.cpp)
target_link_libraries(inference tensorflowlite)

So I am definitely missing something here regarding how to enable Flex delegate support while loading the model, so that the linker recognizes it and links in the appropriate library.

This exact process works fine on macOS, and I don’t understand the difference between these two targets.

Any help would be appreciated, thanks a lot!

Same problem. Have you solved it?

Referencing this, I solved it: add "--no-as-needed" to the gcc link options.
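In a CMake setup like the one above, that can look like the sketch below (same target and library names as in the question; this is one way to pass the flag, not the only one). The underlying issue appears to be that the application never references a symbol from libtensorflowlite_flex directly: the Flex delegate is pulled in via static registration / a weak-symbol hook, so with the default -Wl,--as-needed (visible in the failing link command) the linker considers the library unused and drops it. macOS presumably works because its linker keeps all listed dynamic libraries by default.

add_executable(inference inference.cpp)
target_link_libraries(inference
  tensorflowlite
  -Wl,--no-as-needed   # keep the next library even though no symbols are referenced directly
  tensorflowlite_flex
)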