I trained an H5 model on my computer and converted it with tf.lite.TFLiteConverter.from_keras_model into a TFLite model, but it doesn't seem usable on Android devices. The inference result from interpreter.run is NaN. Where did I make a configuration error?
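A minimal sketch of the conversion path described above, assuming a small stand-in Keras model (replace it with your own `tf.keras.models.load_model("model.h5")`). Running the converted model through the Python interpreter first helps tell whether the NaN comes from the conversion itself or from the Android code:

```python
import numpy as np
import tensorflow as tf

# Hypothetical stand-in model for illustration; substitute your own H5 model,
# e.g. model = tf.keras.models.load_model("model.h5").
model = tf.keras.Sequential([
    tf.keras.layers.Dense(4, activation="relu", input_shape=(8,)),
    tf.keras.layers.Dense(1),
])

# Note the exact casing: tf.lite.TFLiteConverter.from_keras_model
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_bytes = converter.convert()

# Sanity-check the converted model with the Python interpreter before
# deploying to Android: if it already returns NaN here, the problem is in
# the conversion, not in the Android app.
interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

x = np.random.rand(*inp["shape"]).astype(inp["dtype"])
interpreter.set_tensor(inp["index"], x)
interpreter.invoke()
result = interpreter.get_tensor(out["index"])
```

If `result` contains no NaN here but does on Android, the mismatch is most likely in how the Android side prepares the input buffer.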
The H5 model I trained works normally on my computer after being converted to a TFLite model, but it doesn't work on Android devices. What could the problem be?
I am a little bit confused
Can you use the TFLite model with the Python API?
Yes, and it's very fast. So you can be sure the problem is not the H5 model.
In addition, the model in my Android demo seems to be uint8, which has to be fed through a ByteBuffer, while the model I trained in Python is float32. Also, when I run the TFLite model from the Android demo in Python, the results are extremely unsatisfactory: even after adjusting the input dtype to uint8, inference is extremely slow and the output is not what I expect.
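The float32-versus-uint8 mismatch above is a common source of garbage results: a quantized model expects its float inputs to be mapped through the tensor's quantization parameters, not just cast. A hedged sketch of a helper that does this, using the `quantization` field from `interpreter.get_input_details()` (the helper name is mine, not from any API):

```python
import numpy as np

def prepare_input(float_batch, input_detail):
    """Convert a float32 batch into what the model's input tensor expects.

    For a quantized (uint8) input, apply the tensor's (scale, zero_point)
    quantization parameters; otherwise just cast to float32.
    `input_detail` is one entry from interpreter.get_input_details().
    """
    dtype = np.dtype(input_detail["dtype"])
    if dtype == np.uint8:
        scale, zero_point = input_detail["quantization"]
        quantized = np.round(float_batch / scale + zero_point)
        return np.clip(quantized, 0, 255).astype(np.uint8)
    return np.asarray(float_batch, dtype=np.float32)

# Example with a hand-made detail dict (scale=0.5, zero_point=10):
detail = {"dtype": np.uint8, "quantization": (0.5, 10)}
q = prepare_input(np.array([1.0, 2.0]), detail)  # -> uint8 values [12, 14]
```

On the Android side the same mapping has to happen before the values are written into the ByteBuffer; feeding raw float bytes into a uint8 tensor produces exactly the kind of meaningless output described here.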
Can you upload the model somewhere so I can take a look?
It seems the procedure inside the Android app is the problem, but if you say it works with the Python API then we can find a solution.
OK, please wait a moment. By the way, can you recommend some good ways to learn TensorFlow? I feel like I've been studying for a month but still haven't made a big breakthrough.
Here is the link to the compressed package of my demo: Google Drive: Sign-in
Hi @jun_yin, the Google Drive link you shared asks for access.
Hello, does the link I posted above not work? Is this one OK? I seldom use Google Drive.
I'm not sure. I've just started learning TensorFlow, so I shouldn't have expected to make that much progress yet.
Check the examples TensorFlow Lite provides to see what suits your case best.
If you need any help, ping me.