Missing Interpreter class in selective build of TFLite Android

I’ve built the tensorflow-lite.aar and tensorflow-lite-select-tf-ops.aar by following the Reduce TensorFlow Lite binary size guide, in order to build custom AAR files for my TensorFlow Lite models on Android.
Unfortunately, they only contain org.tensorflow.lite.InterpreterApi. org.tensorflow.lite.Interpreter was not found, so I could not use Interpreter#runSignature.
Did I miss something? How can I build custom AAR files that include org.tensorflow.lite.Interpreter?
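For context, this is roughly how I want to drive the model. A minimal sketch, assuming an on-device-training model whose signature key is "train" and whose tensors are named "x", "y" and "loss" (those names depend on how the model was exported):

```java
import java.io.File;
import java.nio.FloatBuffer;
import java.util.HashMap;
import java.util.Map;
import org.tensorflow.lite.Interpreter;

class TrainStep {
    // Runs one training step via the model's "train" signature and returns
    // the reported loss. The signature key and tensor names are placeholders
    // for whatever the model actually exports.
    static float runTrainStep(File modelFile, float[][] x, float[][] y) {
        try (Interpreter interpreter = new Interpreter(modelFile)) {
            Map<String, Object> inputs = new HashMap<>();
            inputs.put("x", x);
            inputs.put("y", y);

            FloatBuffer loss = FloatBuffer.allocate(1);
            Map<String, Object> outputs = new HashMap<>();
            outputs.put("loss", loss);

            // runSignature only exists on org.tensorflow.lite.Interpreter,
            // not on InterpreterApi -- hence the problem above.
            interpreter.runSignature(inputs, outputs, "train");
            return loss.get(0);
        }
    }
}
```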

Hi @Ruoxin_He

Take a look at this. I have created a Colab notebook for this purpose. You can follow the procedure and come back with some feedback.
IMPORTANT: to build tensorflow-lite-select-tf-ops.aar, the free version of Colab is slow and will disconnect after 6-12 hours. You have to use the Pro version or hook Colab up to a GCP VM instance with at least 20 CPU cores. For tensorflow-lite.aar alone, the free version is OK.
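For reference, the selective-build step in the notebook boils down to running the script below from the root of the TensorFlow source checkout (model paths and target ABIs are placeholders):

```bash
# Builds tensorflow-lite.aar (and tensorflow-lite-select-tf-ops.aar when
# the models require select TF ops) containing only the ops the listed
# models actually use.
bash tensorflow/lite/tools/build_aar.sh \
  --input_models=model1.tflite,model2.tflite \
  --target_archs=arm64-v8a,armeabi-v7a
```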

Regards

@George_Soloupis Thank you for your quick reply! I had already built the tensorflow-lite.aar file, but after decompiling it I found that it differs from the official one: it does not contain the Interpreter class.

![screenshot](https://i.v2ex.co/24Zrdcm7.png)
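(For anyone reproducing this check: a full decompile is not needed, listing the classes inside the AAR is enough. A sketch using plain unzip; paths are placeholders:)

```bash
# The Java classes of an AAR live in its embedded classes.jar.
unzip -o tensorflow-lite.aar classes.jar -d /tmp/tflite_aar
# In the custom build this lists only InterpreterApi*, not Interpreter.
unzip -l /tmp/tflite_aar/classes.jar | grep 'org/tensorflow/lite/Interpreter'
```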

Hi @Ruoxin_He

If you want to proceed with your project, then use the prebuilt tensorflow-lite.aar that exists on the Maven repository. Additionally, please file a bug report on the TensorFlow GitHub page.
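For example, pulling the prebuilt artifacts from Maven looks like this in build.gradle (the version is an example; pick the one matching your TensorFlow release):

```groovy
dependencies {
    implementation 'org.tensorflow:tensorflow-lite:2.11.0'
    // Only needed when the model uses select TF ops:
    implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:2.11.0'
}
```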

Regards

@George_Soloupis Thanks! After reading the source code I found that multi-signature support is part of the experimental features. I tried adding experimental = True in tensorflow/lite/tools/build_aar.sh, and the resulting tensorflow-lite.aar seems to work (it works well with the Maven version of the select-ops AAR). But the selectively built tensorflow-lite-select-tf-ops.aar still crashes.

It seems this is not a bug but a feature request :slight_smile:
I have submitted an issue here: https://github.com/tensorflow/tensorflow/issues/59941


Well, the problem has been solved. I hope this can help others:

Purpose

Make a selective build of TFLite for Android to reduce the app size, for a model that uses on-device training.

Main points

  1. Enable the experimental features: add experimental = True to the tflite_custom_android_library() call in tensorflow/lite/tools/build_aar.sh (on TF 2.11 it is at line 73). This is what brings org.tensorflow.lite.Interpreter into the AAR; see the sketch after this list.

  2. Append the option build --config=monolithic to the Bazel config, e.g. with echo -n "build --config=monolithic" >> /tensorflow_src/.bazelrc. This avoids the _ZNK6google8protobuf7Message11GetTypeNameEv error for tensorflow-lite-select-tf-ops.aar.

  3. Building TFLite takes a lot of CPU. I am using a c2-standard-60 (60 vCPUs, 240 GB RAM, 128 GB disk) on Google Compute Engine. One build takes about 45 minutes and costs about $2.
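Putting points 1 and 2 together, the procedure looks roughly like this. A sketch; the exact line number and surroundings in build_aar.sh differ between TF versions, so locate the tflite_custom_android_library() call in your own checkout:

```bash
# Run from the TensorFlow source root (tested against TF 2.11).

# Point 2: force a monolithic build to avoid the protobuf
# _ZNK6google8protobuf7Message11GetTypeNameEv link error.
echo -n "build --config=monolithic" >> .bazelrc

# Point 1: edit tensorflow/lite/tools/build_aar.sh and add
#     experimental = True,
# to the tflite_custom_android_library() call (line 73 on TF 2.11),
# so the generated AAR contains org.tensorflow.lite.Interpreter.

# Then run the selective build (model path and ABIs are placeholders):
bash tensorflow/lite/tools/build_aar.sh \
  --input_models=my_training_model.tflite \
  --target_archs=arm64-v8a,armeabi-v7a
```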
