Using tflite_runtime on Raspberry Pi 4

I have trained my own model, saved it in the .tflite format on my MacBook, and tested it for inference on the same machine using TensorFlow and Keras. It worked fine.

However, when I port the model to the Raspberry Pi and try to run inference using tflite_runtime with the following code:

from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path=TF_MODEL_FILE_PATH)
signatures = interpreter.get_signature_list()
classify_lite = interpreter.get_signature_runner('serving_default')
interpreter.allocate_tensors()

I am getting the following error:

RuntimeError: There is at least 1 reference to internal data
      in the interpreter in the form of a numpy array or slice. Be sure to
      only hold the function returned from tensor() if you are using raw
      data access.

Could you please advise how to get around this problem?

@Telkitty,

Welcome to the TensorFlow Forum!

This error usually occurs when the TFLite interpreter's internal tensor data is still referenced from Python, in the form of a NumPy array or slice, at the moment the interpreter needs to (re)allocate its buffers, for example when you call allocate_tensors() or invoke(). The interpreter refuses to move its buffers while such references are alive. Here are a few suggestions that may help you fix the error:

  • If you use raw data access, hold only the callable returned by interpreter.tensor(index) and recreate the NumPy view each time you need it; do not keep a long-lived array or slice of the interpreter's internal data across calls to allocate_tensors() or invoke().
  • If you need the data as a NumPy array you can keep, use interpreter.get_tensor(index), which returns a copy of the tensor rather than a view of the internal buffer.
  • If you need to slice the data, slice such a copy rather than the interpreter's internal tensor.
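To make this concrete, below is a minimal sketch (not tested against your model) of the same flow with allocate_tensors() moved before the signature runner is created, so that no reference to internal data exists at the moment the buffers are allocated. The input name (input_1) and the input shape are placeholders; check the output of get_signature_list() for the names and shapes your model actually uses:

import numpy as np
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path=TF_MODEL_FILE_PATH)

# Allocate the buffers first, before any signature runner or tensor view exists.
interpreter.allocate_tensors()

# Inspect the signature to find the real input/output names for your model.
signatures = interpreter.get_signature_list()
print(signatures)

classify_lite = interpreter.get_signature_runner('serving_default')

# 'input_1' and the shape below are placeholders for your model's input.
img = np.zeros((1, 224, 224, 3), dtype=np.float32)
predictions = classify_lite(input_1=img)
print(predictions)

If you prefer the lower-level API instead, interpreter.get_input_details() / set_tensor() / invoke() / get_tensor() works the same way on the Pi, and get_tensor() hands back a copy that is safe to keep after the call.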

I hope these suggestions help! Please provide a toy model for further assistance.

Thank you!