CMake build TensorFlow Lite with Flex

Hi, I want to run simple conv2d layer training using the C++ API of TF Lite. I’ve used the examples/minimal project as a reference. Calling trainer->invoke() leads to the following error:

ERROR: TensorFlow Lite Error: Select TensorFlow op(s), included in the given model is(are) not supported by this interpreter. Make sure you apply/link Flex delegate before inference. For the Android, it can be resolved by adding “org.tensorflow:tensorflow-lite-select-tf-ops” dependency…

ERROR: Node number 48 (FlexConv2DBackpropFilter) failed to prepare.

I didn’t find any good tutorial on the Flex delegate. Please tell me how to build TF Lite with Flex using CMake and how to make the interpreter use Flex’s implementation of this node. Thanks
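For reference, my setup roughly follows examples/minimal and looks like the sketch below (the model path and variable names are placeholders, not my actual code):

#include <memory>

#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main() {
  // Training model converted with SELECT_TF_OPS enabled (path is a placeholder).
  auto model = tflite::FlatBufferModel::BuildFromFile("conv2d_train.tflite");
  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> trainer;
  tflite::InterpreterBuilder(*model, resolver)(&trainer);

  // Without the Flex delegate, preparing/invoking the graph fails with the
  // FlexConv2DBackpropFilter error above.
  trainer->AllocateTensors();
  trainer->Invoke();
  return 0;
}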


@mymind1919 I have been trying to achieve the same thing, and on macOS I am able to link and generate a correct binary/executable using the sample CMake snippet below.

To build the required libraries:

bazel build -c opt --config=monolithic tensorflow/lite:libtensorflowlite
bazel build -c opt --config=monolithic tensorflow/lite:libtensorflowlite_flex

Assuming the source file is inference.cpp, you then need to link these libraries:

  1. tensorflowlite
  2. tensorflowlite_flex

add_executable(inference inference.cpp)
target_link_libraries(inference tensorflowlite tensorflowlite_flex)

That enables Flex delegate support on macOS, but the same thing doesn’t work when I try to build it for ARM.
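On the C++ side, once tensorflowlite_flex is linked into the binary, the interpreter is supposed to install the Flex delegate automatically when the model contains Flex ops, so the minimal-style code from the question should work unchanged. If that does not happen in your version, you can try applying the delegate explicitly. The sketch below is untested on my side and assumes your TensorFlow checkout exposes tflite::FlexDelegate::Create() in tensorflow/lite/delegates/flex/delegate.h (check the header for your version):

#include <memory>

#include "tensorflow/lite/delegates/flex/delegate.h"
#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main() {
  // Placeholder model path; use your own training model here.
  auto model = tflite::FlatBufferModel::BuildFromFile("conv2d_train.tflite");
  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(*model, resolver)(&interpreter);

  // Explicitly hand the Flex delegate to the interpreter. The delegate object
  // must stay alive for as long as the interpreter is used.
  auto flex_delegate = tflite::FlexDelegate::Create();
  if (interpreter->ModifyGraphWithDelegate(flex_delegate.get()) != kTfLiteOk) {
    return 1;  // Flex delegate could not be applied.
  }

  interpreter->AllocateTensors();
  interpreter->Invoke();
  return 0;
}

One thing to watch for: the Flex library has to actually end up in the final binary. If nothing references its symbols, some linkers may drop the shared library, in which case referencing the delegate explicitly (as above) or adjusting linker flags may be needed.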