Integrating YOLOv5 tflite Models into Android

Hello,

I’m new to TensorFlow Lite and currently working on integrating a YOLOv5 model, converted to tflite by my team, into an Android application. I’ve been provided with two different versions of the model, but I’m facing challenges integrating either of them. Unfortunately, I couldn’t find any resources to help me with this integration.

Here are visuals of the two model versions:

First Model Option


Second Model Option

Is there anyone who could assist me with this or recommend any resources that might help? Your guidance and suggestions would be greatly appreciated.

Thank you.

Hi @Jim-Morrison ,

You can use either file if you are familiar with machine learning and can read your team's Python code. There you can see how they run inference, and you can port that code to Java/Kotlin. You can also ask them whether the model contains metadata and whether they used TensorFlow/Keras or MediaPipe to train it.
If you are not familiar, then you should probably start with the TensorFlow Lite guide here.
It includes an Object Detection example. Check the guide on how to use TensorFlow Lite for inference before you jump into the Android code.
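Before diving into the Android code, it may help to see the general shape of Task Library usage in Kotlin. This is a minimal sketch, assuming a .tflite file with metadata in src/main/assets; the file name, maxResults, and score threshold are placeholders, not values from this thread, and this only runs on an Android device:

```kotlin
import android.content.Context
import android.graphics.Bitmap
import org.tensorflow.lite.support.image.TensorImage
import org.tensorflow.lite.task.vision.detector.Detection
import org.tensorflow.lite.task.vision.detector.ObjectDetector

// Sketch only: requires a .tflite file *with metadata* in src/main/assets.
// "model.tflite", maxResults, and the score threshold are placeholder values.
fun detectObjects(context: Context, bitmap: Bitmap): List<Detection> {
    val options = ObjectDetector.ObjectDetectorOptions.builder()
        .setMaxResults(5)
        .setScoreThreshold(0.5f)
        .build()
    val detector = ObjectDetector.createFromFileAndOptions(
        context, "model.tflite", options
    )
    // Each Detection carries a boundingBox (RectF) and a list of scored categories.
    return detector.detect(TensorImage.fromBitmap(bitmap))
}
```

Without metadata in the model file, `createFromFileAndOptions` will throw during initialization, which is exactly the class of error discussed below.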
One of the models this example uses is pretty much like yours:

Tag me if you have more questions.

Regards

mobilenetv1.tflite

I’ve successfully integrated the model mentioned earlier (mobilenetv1.tflite), which is similar to mine, as depicted in the attached image. With this model I can generate the expected output, and the integration is functioning smoothly.

myFirstModel

However, when I add ‘myFirstModel’ to the same project and use the sample code suggested by Android Studio, the application throws an exception. Adding ‘myFirstModel’ to the example project from the documentation throws an exception as well.

The exception I get when trying to integrate the “myFirstModel.tflite” model into the project where I integrated the “mobilenetv1.tflite” model:

java.lang.IllegalArgumentException: Label number 1 mismatch the shape on axis 1
                             	at org.tensorflow.lite.support.common.internal.SupportPreconditions.checkArgument(SupportPreconditions.java:104)
                             	at org.tensorflow.lite.support.label.TensorLabel.<init>(TensorLabel.java:87)
                             	at org.tensorflow.lite.support.label.TensorLabel.<init>(TensorLabel.java:105)
                             	at com.moveon.objectdetectionstudy.ml.MyFirstModel$Outputs.getOutputAsCategoryList(MyFirstModel.java:105)
                             	at com.moveon.objectdetectionstudy.ObjectDetectionAnalyzer.analyze(ObjectDetectionAnalyzer.kt:31)
                             	at androidx.camera.core.ImageAnalysis.lambda$setAnalyzer$2(ImageAnalysis.java:481)
                             	at androidx.camera.core.ImageAnalysis$$ExternalSyntheticLambda2.analyze(D8$$SyntheticClass)
                             	at androidx.camera.core.ImageAnalysisAbstractAnalyzer.lambda$analyzeImage$0$androidx-camera-core-ImageAnalysisAbstractAnalyzer(ImageAnalysisAbstractAnalyzer.java:286)
                             	at androidx.camera.core.ImageAnalysisAbstractAnalyzer$$ExternalSyntheticLambda1.run(D8$$SyntheticClass)
                             	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1133)
                             	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:607)
                             	at java.lang.Thread.run(Thread.java:761)

The exception I got when trying to integrate the “myFirstModel.tflite” model into the sample project from the documentation:

Error getting native address of native library: task_vision_jni
                             java.lang.RuntimeException: Error occurred when initializing ObjectDetector: Input tensor has type kTfLiteFloat32: it requires specifying NormalizationOptions metadata to preprocess input images.
                             	at org.tensorflow.lite.task.vision.detector.ObjectDetector.initJniWithModelFdAndOptions(Native Method)
                             	at org.tensorflow.lite.task.vision.detector.ObjectDetector.access$000(ObjectDetector.java:88)
                             	at org.tensorflow.lite.task.vision.detector.ObjectDetector$1.createHandle(ObjectDetector.java:156)
                             	at org.tensorflow.lite.task.vision.detector.ObjectDetector$1.createHandle(ObjectDetector.java:149)
                             	at org.tensorflow.lite.task.core.TaskJniUtils$1.createHandle(TaskJniUtils.java:70)
                             	at org.tensorflow.lite.task.core.TaskJniUtils.createHandleFromLibrary(TaskJniUtils.java:91)
                             	at org.tensorflow.lite.task.core.TaskJniUtils.createHandleFromFdAndOptions(TaskJniUtils.java:66)
                             	at org.tensorflow.lite.task.vision.detector.ObjectDetector.createFromFileAndOptions(ObjectDetector.java:147)
                             	at org.tensorflow.lite.examples.objectdetection.ObjectDetectorHelper.setupObjectDetector(ObjectDetectorHelper.kt:96)
                             	at org.tensorflow.lite.examples.objectdetection.ObjectDetectorHelper.detect(ObjectDetectorHelper.kt:107)
                             	at org.tensorflow.lite.examples.objectdetection.fragments.CameraFragment.detectObjects(CameraFragment.kt:289)
                             	at org.tensorflow.lite.examples.objectdetection.fragments.CameraFragment.bindCameraUseCases$lambda-9$lambda-8(CameraFragment.kt:264)
                             	at org.tensorflow.lite.examples.objectdetection.fragments.CameraFragment.$r8$lambda$trA1WWYM4Jg8atYGW5F6kpxoOW8(CameraFragment.kt)
                             	at org.tensorflow.lite.examples.objectdetection.fragments.CameraFragment$$ExternalSyntheticLambda7.analyze(D8$$SyntheticClass)
                             	at androidx.camera.core.ImageAnalysis.lambda$setAnalyzer$2(ImageAnalysis.java:476)
                             	at androidx.camera.core.ImageAnalysis$$ExternalSyntheticLambda0.analyze(D8$$SyntheticClass)
                             	at androidx.camera.core.ImageAnalysisAbstractAnalyzer.lambda$analyzeImage$0$androidx-camera-core-ImageAnalysisAbstractAnalyzer(ImageAnalysisAbstractAnalyzer.java:283)
                             	at androidx.camera.core.ImageAnalysisAbstractAnalyzer$$ExternalSyntheticLambda1.run(D8$$SyntheticClass)
                             	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1133)
                             	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:607)
                             	at java.lang.Thread.run(Thread.java:761)

mySecondModel

When I add ‘mySecondModel’ to the same project and use the sample code suggested by Android Studio, the application throws an exception. Adding ‘mySecondModel’ to the example project from the documentation throws an exception as well.

In addition, since I could run the “mobilenetv1” model, I told my team that I could run a similar model, and they gave me the “mySecondModel” model. The only difference between the two models is that one of the outputs of “mySecondModel” is of type Int32. We could not convert this Int32 output to float32. I am hoping that if this Int32 output becomes float32 the problem will be solved, but this is where we are stuck. If you have a suggestion on this issue, we would be glad to hear it.


The exception I get when trying to integrate the “mySecondModel.tflite” model into the project where I integrated the “mobilenetv1.tflite” model:

Type mismatch: inferred type is Triple<RectF, String, TensorBuffer> but Triple<RectF, String, Int> was expected

The exception I got when trying to integrate the “mySecondModel” model into the sample project from the documentation:

16:06:58.187 tflite       E  Select TensorFlow op(s), included in the given model, is(are) not supported by this interpreter. Make sure you apply/link the Flex delegate before inference. For the Android, it can be resolved by adding "org.tensorflow:tensorflow-lite-select-tf-ops" dependency. See instructions: https://www.tensorflow.org/lite/guide/ops_select
16:06:58.187 tflite       E  Node number 132 (FlexTensorListReserve) failed to prepare.
16:06:58.188 tflite       E  Select TensorFlow op(s), included in the given model, is(are) not supported by this interpreter. Make sure you apply/link the Flex delegate before inference. For the Android, it can be resolved by adding "org.tensorflow:tensorflow-lite-select-tf-ops" dependency. See instructions: https://www.tensorflow.org/lite/guide/ops_select
16:06:58.188 tflite       E  Node number 132 (FlexTensorListReserve) failed to prepare.
16:06:58.191 Task...tils  E  Error getting native address of native library: task_vision_jni
                             java.lang.IllegalStateException: Error occurred when initializing ObjectDetector: AllocateTensors() failed.
                             	at org.tensorflow.lite.task.vision.detector.ObjectDetector.initJniWithModelFdAndOptions(Native Method)
                             	at org.tensorflow.lite.task.vision.detector.ObjectDetector.access$000(ObjectDetector.java:88)
                             	at org.tensorflow.lite.task.vision.detector.ObjectDetector$1.createHandle(ObjectDetector.java:156)
                             	at org.tensorflow.lite.task.vision.detector.ObjectDetector$1.createHandle(ObjectDetector.java:149)
                             	at org.tensorflow.lite.task.core.TaskJniUtils$1.createHandle(TaskJniUtils.java:70)
                             	at org.tensorflow.lite.task.core.TaskJniUtils.createHandleFromLibrary(TaskJniUtils.java:91)
                             	at org.tensorflow.lite.task.core.TaskJniUtils.createHandleFromFdAndOptions(TaskJniUtils.java:66)
                             	at org.tensorflow.lite.task.vision.detector.ObjectDetector.createFromFileAndOptions(ObjectDetector.java:147)
                             	at org.tensorflow.lite.examples.objectdetection.ObjectDetectorHelper.setupObjectDetector(ObjectDetectorHelper.kt:96)
                             	at org.tensorflow.lite.examples.objectdetection.ObjectDetectorHelper.detect(ObjectDetectorHelper.kt:107)
                             	at org.tensorflow.lite.examples.objectdetection.fragments.CameraFragment.detectObjects(CameraFragment.kt:289)
                             	at org.tensorflow.lite.examples.objectdetection.fragments.CameraFragment.bindCameraUseCases$lambda-9$lambda-8(CameraFragment.kt:264)
                             	at org.tensorflow.lite.examples.objectdetection.fragments.CameraFragment.$r8$lambda$trA1WWYM4Jg8atYGW5F6kpxoOW8(CameraFragment.kt)
                             	at org.tensorflow.lite.examples.objectdetection.fragments.CameraFragment$$ExternalSyntheticLambda7.analyze(D8$$SyntheticClass)
                             	at androidx.camera.core.ImageAnalysis.lambda$setAnalyzer$2(ImageAnalysis.java:476)
                             	at androidx.camera.core.ImageAnalysis$$ExternalSyntheticLambda0.analyze(D8$$SyntheticClass)
                             	at androidx.camera.core.ImageAnalysisAbstractAnalyzer.lambda$analyzeImage$0$androidx-camera-core-ImageAnalysisAbstractAnalyzer(ImageAnalysisAbstractAnalyzer.java:283)
                             	at androidx.camera.core.ImageAnalysisAbstractAnalyzer$$ExternalSyntheticLambda1.run(D8$$SyntheticClass)
                             	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1133)
                             	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:607)
                             	at java.lang.Thread.run(Thread.java:761)

How can I successfully integrate one of these two models? I have searched for these exceptions but have not yet found a workable solution. I have also not found a sample project written in Kotlin using a model converted from YOLOv5 to TensorFlow Lite.

Best regards.

Hi,

First error:
java.lang.IllegalArgumentException: Label number 1 mismatch the shape on axis 1
This shows that the support library that loads the Bitmap expects a different output shape. This is a project using TensorFlow Lite with the TensorFlow Lite Support Library.

Second error:
java.lang.RuntimeException: Error occurred when initializing ObjectDetector: Input tensor has type kTfLiteFloat32: it requires specifying NormalizationOptions metadata to preprocess input images.
This shows that it is a project using the Task Library. This library needs a .tflite file with metadata. If you look at the code snippet Android Studio generates when the MobileNetV1.tflite file is used, you can see that it has Min/Max of 0/255, but for your model those parameters are empty.
The solution is to ask your ML team to insert metadata into your tflite file. Check how this is done here.
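For intuition: the NormalizationOptions that the metadata carries simply describe how raw 0–255 pixel channels are scaled before they reach the kTfLiteFloat32 input tensor. A mean of 0 and std of 255 (the values Android Studio shows for MobileNetV1) are equivalent to this pure-Kotlin transform (the function name is mine, for illustration):

```kotlin
// Equivalent of NormalizationOptions(mean = 0, std = 255): each 0..255
// channel value becomes a float in 0.0..1.0 via (value - mean) / std.
fun normalizePixels(pixels: IntArray, mean: Float = 0f, std: Float = 255f): FloatArray =
    FloatArray(pixels.size) { i -> (pixels[i] - mean) / std }
```

With metadata present, the Task Library applies this scaling for you; without it, initialization fails with exactly the kTfLiteFloat32 error quoted above.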

Third error:
Type mismatch: inferred type is Triple<RectF, String, TensorBuffer> but Triple<RectF, String, Int> was expected
This is a compile-time error rather than a runtime exception: the code builds a Triple whose third element is a TensorBuffer, but the surrounding code expects an Int, so you have to extract the integer value from the TensorBuffer before constructing the Triple.

The fourth error is really two issues:
16:06:58.187 tflite E Select TensorFlow op(s), included in the given model, is(are) not supported by this interpreter. Make sure you apply/link the Flex delegate before inference. For the Android, it can be resolved by adding "org.tensorflow:tensorflow-lite-select-tf-ops" dependency. See instructions: https://www.tensorflow.org/lite/guide/ops_select
This specifically instructs you to add the org.tensorflow:tensorflow-lite-select-tf-ops dependency to your build.gradle file. That will probably not solve everything, though, because the line below:
java.lang.IllegalStateException: Error occurred when initializing ObjectDetector: AllocateTensors() failed.
shows that the Task Library cannot work, probably because the tflite file was created with a different version of TensorFlow than what the Task Library supports. You can try different versions, though, checking here.
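For the first half of this error, the dependency goes into the app module's build file. A sketch in Gradle Kotlin DSL; the version numbers are placeholders, keep them in sync with the org.tensorflow:tensorflow-lite version your project already uses:

```kotlin
// app/build.gradle.kts -- versions are placeholders; align them with the
// tensorflow-lite version already used by the project.
dependencies {
    implementation("org.tensorflow:tensorflow-lite:2.14.0")
    implementation("org.tensorflow:tensorflow-lite-select-tf-ops:2.14.0")
}
```

Note that mismatched versions between these two artifacts are themselves a common source of AllocateTensors() failures.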

If I were you, I would persist with the Task Library and a tflite file with metadata (error number 2). It is a high-level API and works out of the box once the metadata is in place; add it and that project will work immediately.

Regards

@Jim-Morrison

Did you manage to move forward?

Exception Solving Attempts

For the first exception, I changed the “outputAsCategoryList” section and now I can obtain a tensor output containing 42,000 float values from the model. However, this raw data hasn’t been useful for my task.
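For what it's worth, raw YOLOv5 output like those 42,000 floats can be decoded by hand once the layout is known. A hedged pure-Kotlin sketch, assuming the usual YOLOv5 row layout of [cx, cy, w, h, objectness, class scores...]; the Detection class, function name, and threshold here are mine, for illustration:

```kotlin
// Hypothetical decoder for a flat YOLOv5 output, assuming each row is
// [cx, cy, w, h, objectness, score_0 .. score_{numClasses-1}], all in 0..1.
data class Detection(val cx: Float, val cy: Float, val w: Float, val h: Float,
                     val score: Float, val classId: Int)

fun decodeYolo(raw: FloatArray, numClasses: Int, threshold: Float = 0.25f): List<Detection> {
    val stride = 5 + numClasses
    require(raw.size % stride == 0) { "output length must be a multiple of $stride" }
    val detections = mutableListOf<Detection>()
    for (i in raw.indices step stride) {
        val objectness = raw[i + 4]
        if (objectness < threshold) continue          // drop low-confidence rows early
        // pick the class with the highest score for this row
        var bestClass = 0
        var bestScore = 0f
        for (c in 0 until numClasses) {
            val s = raw[i + 5 + c]
            if (s > bestScore) { bestScore = s; bestClass = c }
        }
        val score = objectness * bestScore            // combined confidence
        if (score >= threshold) {
            detections += Detection(raw[i], raw[i + 1], raw[i + 2], raw[i + 3], score, bestClass)
        }
    }
    return detections
}
```

A real pipeline would also apply non-maximum suppression to the surviving boxes; this sketch only does thresholding and class selection.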

As for the second exception, I relayed your suggestion to our ML team, but they responded with “We could not do it.”

The third exception was a code issue: everything appeared correct while coding, but the build failed with a “Type mismatch” error. Unfortunately, we haven’t made any progress on resolving it.

Regarding the fourth error, I have tried various versions of the “org.tensorflow:tensorflow-lite-select-tf-ops” and “tensorflow-lite-task-vision” libraries, but nothing has changed; I still get both exceptions in the same way.

Our new route to run models in Android

I have researched the Task Library during this process and noticed the “Model compatibility requirements” section shown in the image below. Looking at these requirements, I realized that neither of our two models meets them. Does the model from Exception 2 really meet them? I suspect your suggestion would not work, because the requirements say the model must have four outputs of type kTfLiteFloat32. The same applies to the model from Exception 4: it also has four outputs, but one of them is of type Int32. Is it really possible to use these two models with the Task Library?

After further research, I came across the “TensorFlow Lite Model Maker” and “Supported object detector models” sections in the documentation and mentioned them to the team yesterday. They provided me with a model created using TensorFlow Lite Model Maker. This model runs, but it performs very poorly, and it is unclear whether it achieves the desired inference. We are working on it, but do you think this is normal? I wanted to share the output of this Model Maker model in this thread, but it does not work as we want, so it does not solve our problem.

Best regards.


@Jim-Morrison

I understand that you have a lot of issues with the models. The only thing you can do now is upload your models and/or code somewhere so I can take a look, if that is possible and not under NDA. I can help you with the metadata as well.

Regards