Running Object Detection model with GPU using Task Library on Android

I have an object detection model trained with TFLite Model Maker. I can run it in my Android app with the Task Library, but it currently runs on the phone’s CPU. How can I make the model run on the GPU with the Task Library? (I am calling it from Java, not C++.)

Moreover, I tried running the model with the Interpreter API as in the sample (examples/lite/examples/object_detection/android/lib_interpreter at master · tensorflow/examples · GitHub), but the results from lib_interpreter are worse than the results from the Task Library. Is that because the Task Library applies some additional tuning when running inference?

Hope you can support me.

Hi @Yen-Thanh_Le

At this moment there is no option to use the Task Library with the GPU; the TFLite team is working on it.
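In the meantime, GPU inference is available through the Interpreter API’s GpuDelegate. A minimal sketch of wiring it up (the class and method names here are illustrative, and you need the `org.tensorflow:tensorflow-lite-gpu` dependency in your Gradle file):

```java
import java.io.File;

import org.tensorflow.lite.Interpreter;
import org.tensorflow.lite.gpu.GpuDelegate;

public final class GpuInference {

    /**
     * Creates an Interpreter that runs on the GPU delegate.
     * Unsupported ops fall back to the CPU automatically.
     */
    public static Interpreter createGpuInterpreter(File modelFile) {
        GpuDelegate gpuDelegate = new GpuDelegate();
        Interpreter.Options options = new Interpreter.Options().addDelegate(gpuDelegate);
        // Caller is responsible for close()-ing both the Interpreter
        // and the GpuDelegate when inference is finished.
        return new Interpreter(modelFile, options);
    }
}
```

Note that this only runs on a physical device with GPU support; it will not work in a plain JVM or most emulators.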
Also, what do you mean by:

“the result running on lib_interpreter is worse than the result running on Task Library”

Please define “worse”…

Sorry for the confusing wording. I mean that the detection results from the TensorFlow Interpreter are not as accurate as the results from the Task Library. For example, I trained a model for vehicle license plate detection: the Task Library outputs a detection box that covers 100% of the plate, while the Interpreter outputs a box that covers only about a third of it.

We have been updating the object detection example over the past month. It seems that we have to use two different getTransformationMatrix methods, one for the Task Library and one for the Interpreter, or find an abstract class for drawing the boxes.

Now it is up to you to change the method mentioned above; the models work fine, and the problem appears afterwards, when you want to render the results on screen.

If you have the project online, please paste the link so we can review the problem.
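At its core, the transformation in question is just scaling box corners from the model’s input coordinate space to the view’s coordinate space, with independent x and y factors when the aspect ratios differ. A self-contained sketch (class and method names are illustrative, not from the sample app):

```java
import java.util.Arrays;

public final class BoxMapper {

    /**
     * Scales a detection box (left, top, right, bottom) by independent
     * horizontal and vertical factors, e.g. viewWidth / modelInputWidth.
     */
    public static float[] scaleBox(float left, float top, float right, float bottom,
                                   float scaleX, float scaleY) {
        return new float[] {left * scaleX, top * scaleY, right * scaleX, bottom * scaleY};
    }

    public static void main(String[] args) {
        // A box detected on a 300x300 model input, drawn into a 600x1200 view:
        float[] view = scaleBox(75f, 75f, 225f, 225f, 600f / 300f, 1200f / 300f);
        System.out.println(Arrays.toString(view)); // [150.0, 300.0, 450.0, 900.0]
    }
}
```

If the two code paths (Task Library vs. Interpreter) compute these scale factors against different reference sizes, the boxes will land in different places on screen even though the raw detections agree.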


I know this is an old thread, but I’m trying to do something similar to the OP.

I’m following the object detection tutorial (Object detection with Android | TensorFlow Lite) and I’ve got the app built and running on a device with the delegate set to CPU. However, when I select GPU for the delegate within the app, I get an in-app error: “GPU is not supported on this device.”
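From what I can tell, that error message suggests the app is gating the GPU option on TFLite’s CompatibilityList and the check is returning false. A sketch of such a check (only the TFLite classes are real; the wrapper is mine):

```java
import org.tensorflow.lite.gpu.CompatibilityList;

public final class GpuSupportCheck {

    /** Returns true if the TFLite GPU delegate is usable on this device. */
    public static boolean isGpuDelegateSupported() {
        // CompatibilityList consults a device allowlist maintained by TFLite;
        // it can return false even on phones with capable GPUs.
        try (CompatibilityList compatList = new CompatibilityList()) {
            return compatList.isDelegateSupportedOnThisDevice();
        }
    }
}
```

This has to run on an actual Android device, so I can’t easily test it outside the app.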

Is it still true that “there is no option to use Task library with GPU”, as stated two years ago?

I imagine my mistake is something really basic, but I’ve tried it on an S22, an S23, a Pixel 6, and a Pixel 7, all with the same result.

Any guidance appreciated.