TFLite Android: load model by network

I would like to load my TFLite model from a specific folder rather than from the assets folder.
The TFLiteObjectDetectionAPIModel.create function doesn't allow this. What is the correct way to initialize the TFLite model from outside the assets folder?

Hi @gael_cobert

Could you share a link to the project or documentation where you see this function not working the way you want?


The docs say "path relative to the assets folder", but I want to give a full path, not one relative to the assets folder.



```java
/**
 * Initializes a native TensorFlow session for classifying images.
 *
 * {@code labelFilename}, {@code inputSize}, and {@code isQuantized}, are NOT required, but to
 * keep consistency with the implementation using the TFLite Interpreter Java API. See <a
 * href="">lib_interpreter</a>.
 *
 * @param modelFilename The model file path relative to the assets folder
 * @param labelFilename The label file path relative to the assets folder
 * @param inputSize The size of image input
 * @param isQuantized Boolean representing model is quantized or not
 */
```

Thank you for the code snippet. Could you also share a link to the web page where this code snippet appears?


In this file

nothing says you have to stick with the assets folder.
You can change the loadModelFile() function to use a path of your own, for example the app's internal files directory.

Try and tell me how it goes.
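As a minimal sketch of what that change could look like: instead of memory-mapping the model through the AssetManager, you can map it from any absolute file path with plain java.nio. The class and method names below (ModelLoader, loadModelFromPath) are my own for illustration, not part of the demo app, and the TFLite Interpreter hand-off is shown only in a comment.

```java
import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class ModelLoader {

  // Memory-maps a .tflite file from an absolute path. This stands in for the
  // asset-based loadModelFile(); the method name here is hypothetical.
  public static MappedByteBuffer loadModelFromPath(String path) throws IOException {
    File modelFile = new File(path);
    try (RandomAccessFile raf = new RandomAccessFile(modelFile, "r");
         FileChannel channel = raf.getChannel()) {
      // READ_ONLY mapping of the whole file, same as the asset-based variant.
      return channel.map(FileChannel.MapMode.READ_ONLY, 0, channel.size());
    }
  }

  public static void main(String[] args) throws IOException {
    // e.g. a model previously copied to the app's internal storage:
    //   String path = context.getFilesDir() + "/detect.tflite";
    MappedByteBuffer modelBuffer = loadModelFromPath(args[0]);
    System.out.println("Mapped " + modelBuffer.capacity() + " bytes");
    // The buffer can then be handed to the TFLite Interpreter:
    //   Interpreter interpreter = new Interpreter(modelBuffer);
  }
}
```

The MappedByteBuffer returned this way is accepted by the Interpreter constructor exactly like the one produced from an asset file descriptor, so the rest of the detection code should not need to change.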


OK, thanks for your response. I'll try it now.