TFLite Android: load model from a path instead of assets

Hello,
I would like to load my TFLite model from a specific folder rather than from the assets folder. The TFLiteObjectDetectionAPIModel.create function doesn't allow this. What is the method to initialize the TFLite model outside of the assets folder?
Thanks

Hi @gael_cobert

Could you share a link to the project/documentation where this function doesn't work the way you want?

Thanks

path relative to the assets folder ==> I want to give a full path, not a path relative to the assets folder

TFLiteObjectDetectionAPIModel.create(
    context,
    TF_OD_API_MODEL_FILE,
    TF_OD_API_LABELS_FILE,
    TF_OD_API_INPUT_SIZE,
    TF_OD_API_IS_QUANTIZED);

/**
 * Initializes a native TensorFlow session for classifying images.
 *
 * <p>{@code labelFilename}, {@code inputSize}, and {@code isQuantized} are NOT required, but are
 * kept for consistency with the implementation using the TFLite Interpreter Java API. See <a
 * href="https://github.com/tensorflow/examples/blob/master/lite/examples/object_detection/android/lib_interpreter/src/main/java/org/tensorflow/lite/examples/detection/tflite/TFLiteObjectDetectionAPIModel.java">lib_interpreter</a>.
 *
 * @param modelFilename The model file path, relative to the assets folder
 * @param labelFilename The label file path, relative to the assets folder
 * @param inputSize The size of the image input
 * @param isQuantized Whether the model is quantized or not
 */

Thank you for the code snippet. Could you also share a link to the web page where this code snippet lives?

Best

In this file

it does not necessarily say that you have to stick with the assets folder.
You can change the loadModelFile() function and use a path of your own… say, from the internal files dir.
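As a sketch of that approach (the method name loadModelFile mirrors the example project; the exact signature there may differ), the asset-based loader can be replaced with one that memory-maps the model from an absolute path, e.g. a file under the app's internal files dir (context.getFilesDir()). The returned MappedByteBuffer can then be handed to the TFLite Interpreter constructor:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class ModelLoader {

    // Memory-maps a .tflite model from an absolute file path instead of
    // the assets folder. The buffer it returns is read-only and can be
    // passed to org.tensorflow.lite.Interpreter.
    public static MappedByteBuffer loadModelFile(String modelPath) throws IOException {
        File modelFile = new File(modelPath);
        try (FileInputStream inputStream = new FileInputStream(modelFile);
             FileChannel fileChannel = inputStream.getChannel()) {
            // Map the whole file into memory without copying it.
            return fileChannel.map(FileChannel.MapMode.READ_ONLY, 0, modelFile.length());
        }
    }
}
```

Usage would then look something like new Interpreter(loadModelFile(context.getFilesDir() + "/detect.tflite")), where the file name is just an illustration of a model you copied or downloaded there yourself.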

Try it and tell me how it goes.

Thanks

OK, thanks for your response. I'll try it now.