Using TFLite models with ML Kit for Flutter

Calculator::Open() for node "BoxClassifierCalculator" failed: #vk Type mismatch for input tensor image. Requested one of these types: kTfLiteUint8/kTfLiteFloat32, got INT8. [type.googleapis.com/mediapipe.StatusList='\n\xde\x01\x08\x03\x12\xac\x01\x43\x61lculator::Open() for node "BoxClassifierCalculator" failed: #vk Type mismatch for input tensor image. Requested one of these types: kTfLiteUint8/kTfLiteFloat32, got INT8.\x1a+\n$tflite::support::TfLiteSupportStatus\x12\x03\x33\x30\x32']

com.google.mlkit.common.MlKitException: Failed to initialize detector. Type mismatch for input tensor image. Requested one of these types: kTfLiteUint8/kTfLiteFloat32, got INT8.
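From what I can tell so far (this is my own reading of the error, not something the docs confirm), INT8 and kTfLiteUint8 are both 8-bit tensor types, but int8 is signed and uint8 is unsigned, so the runtime treats them as incompatible even though they are the same size. A minimal Python illustration of why the same raw byte means two different numbers depending on which type the model declares:

```python
import struct

# The same raw byte 0xC8 interpreted as unsigned vs. signed 8-bit.
raw = bytes([0xC8])
as_uint8 = struct.unpack("B", raw)[0]  # unsigned 8-bit -> 200
as_int8 = struct.unpack("b", raw)[0]   # signed 8-bit   -> -56
print(as_uint8, as_int8)  # 200 -56
```

So a model whose input tensor is declared int8 really is a different interface than one declared uint8, which I assume is why ML Kit rejects it outright rather than converting.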


That's the error I'm getting. Now some context about my project:

  • I have a Flutter app in which I want to recognize people, cars, and bikes at a certain distance. I first tried Flutter-specific packages such as tflite_flutter (tflite_flutter | Flutter package), but I couldn't get good results running it on video, so I'm now using Google's ml_kit to achieve this.
  • I can get results with the default model (or the models that ship with the app), but I haven't been able to make it work with any outside model yet.
  • I tried a quantized YOLOv7 model, and that's when I got the error above. The main issue is that I don't know how to meet the specifications the error is asking for; I feel like I'm missing something obvious. No model I've found on Kaggle (or anywhere else) documents this: they all talk about int8 and float16, but none of them mention kTfLiteUint8 or kTfLiteFloat32. The only models I can make work are the defaults that come with the package, and they don't meet my needs.
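From reading around, my guess is that the fix is to re-export the model so its input/output (interface) tensors are uint8 or float32, even if the internal weights stay int8-quantized. This is a hedged sketch of what I think the conversion would look like with TensorFlow's converter; "saved_model_dir", the 320x320 input shape, and the representative_dataset stub are placeholders for the real model and calibration data, and I haven't confirmed ML Kit accepts the result:

```python
# Sketch: re-export a quantized model with uint8 interface tensors.
# Assumes you have the original SavedModel, not just the .tflite file.
import numpy as np
import tensorflow as tf

def representative_dataset():
    # Placeholder calibration samples; replace with real preprocessed
    # frames matching the model's expected input shape.
    for _ in range(100):
        yield [np.random.rand(1, 320, 320, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# Force the *interface* tensors to uint8 (one of the types the error
# asks for); the quantized weights inside can remain int8.
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

tflite_model = converter.convert()
with open("model_uint8.tflite", "wb") as f:
    f.write(tflite_model)
```

If this is the right direction, it would explain the confusion: "int8 quantization" in most model cards refers to the weights, while the error is about the declared type of the input tensor, which the converter controls separately.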

If anybody can guide me through this specific error, or help me with the tflite_flutter package if you have experience with it, I'd really appreciate it. I also ran into problems with tflite_flutter: I couldn't get live inference on video working at all. I just want to understand what I'm getting wrong. This is a project for work, so I can't just give up on it.