Facing issue while loading TensorFlow Lite model

While loading the TensorFlow Lite model, I am facing the error below:

interpreter = tflite.Interpreter(model_path='model.tflite')
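
For context, the surrounding code is roughly the following (the import alias is inferred from the traceback below, and the model path is simplified):

```python
# Minimal loading sketch (import alias inferred from the traceback; path simplified)
import tflite_runtime.interpreter as tflite

interpreter = tflite.Interpreter(model_path='model.tflite')  # this line raises the error
interpreter.allocate_tensors()
```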

Error:

ValueError                                Traceback (most recent call last)
Cell In[66], line 1
----> 1 interpreter = tflite.Interpreter(model_path='model.tflite')

File c:\Users\sfaldessai\AppData\Local\Programs\Python\Python38\lib\site-packages\tflite_runtime\interpreter.py:205, in Interpreter.__init__(self, model_path, model_content, experimental_delegates, num_threads)
    198 custom_op_registerers_by_name = [
    199     x for x in self._custom_op_registerers if isinstance(x, str)
    200 ]
    201 custom_op_registerers_by_func = [
    202     x for x in self._custom_op_registerers if not isinstance(x, str)
    203 ]
    204 self._interpreter = (
--> 205     _interpreter_wrapper.CreateWrapperFromFile(
    206         model_path, custom_op_registerers_by_name,
    207         custom_op_registerers_by_func))
    208 if not self._interpreter:
    209     raise ValueError('Failed to open {}'.format(model_path))

ValueError: Unsupported data type 14 in tensor
Unsupported data type 14 in tensor
Unsupported data type 14 in tensor
Unsupported data type 14 in tensor

@Sonali_Faldesai,

Welcome to the TensorFlow Forum!

What is the version of tflite you are using?
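
For example, something like this would print both versions (assuming both packages were installed with pip into the same environment):

```python
# Print the installed package versions (importlib.metadata is available on Python 3.8+)
from importlib.metadata import version

print("tflite:", version("tflite"))
print("tflite-runtime:", version("tflite-runtime"))
```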

Thank you!

Hi Team,

Below are the details:

tflite:
Name: tflite
Version: 2.10.0
Summary: Parsing TensorFlow Lite Models (*.tflite) Easily

tflite_runtime:
WARNING: Ignoring invalid distribution -ensorflow-intel (c:\users\sfaldessai\appdata\local\programs\python\python38\lib\site-packages)
Name: tflite-runtime
Version: 2.5.0
Summary: TensorFlow Lite is for mobile and embedded devices.

@Sonali_Faldesai,

Could you please try with the latest version of tensorflow and tflite-runtime?

Thank you!

Hi Team,

I was facing this issue on Windows, where the latest available tflite-runtime version is 2.5.0, whereas when I tried it on Linux with 2.11.0 the issue was resolved.

But I am facing another issue, shown below, while trying to invoke the interpreter:

File "/home/ubuntu/Documents/pythonProject1/tensorflow_lite.py", line 48, in <module>
    interpreter.invoke()
  File "/home/ubuntu/Documents/pythonProject1/venv/lib/python3.8/site-packages/tflite_runtime/interpreter.py", line 917, in invoke
    self._interpreter.Invoke()
RuntimeError: Select TensorFlow op(s), included in the given model, is(are) not supported by this interpreter. Make sure you apply/link the Flex delegate before inference. For the Android, it can be resolved by adding "org.tensorflow:tensorflow-lite-select-tf-ops" dependency. See instructions: https://www.tensorflow.org/lite/guide/ops_select
Node number 5 (FlexTensorListReserve) failed to prepare.

Could you please help me with this issue?
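
From the error message, my understanding is that the model contains Select TensorFlow ops (FlexTensorListReserve), which the standalone tflite_runtime package cannot execute because it does not link the Flex delegate. Would switching to the interpreter from the full tensorflow package be the right approach? A rough sketch of what I mean (assuming the full tensorflow pip package bundles Select TF ops support on desktop):

```python
# Rough sketch: run the same model with the interpreter from the full
# tensorflow package instead of tflite_runtime (assumption: the full
# package links the Flex delegate needed for Select TF ops).
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path='model.tflite')
interpreter.allocate_tensors()
interpreter.invoke()
```

(The model itself appears to have been converted with SELECT_TF_OPS already, since it contains Flex ops; the question is only whether the runtime provides the Flex delegate.)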

Regards,
Sonali

Hello, I am facing an issue when trying to load my tflite model.
It says ValueError: No subgraph in the model.

@Sonali_Faldesai, @Gbenga_Raji,

Could you please share standalone code to reproduce the issue reported here?
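
For example, a self-contained snippet along these lines (model path and dummy input are placeholders), together with the model file or the conversion code that produced it, would be enough:

```python
# Hypothetical minimal repro template (model path and input are placeholders)
import numpy as np
import tflite_runtime.interpreter as tflite

interpreter = tflite.Interpreter(model_path='model.tflite')
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
dummy = np.zeros(input_details[0]['shape'], dtype=input_details[0]['dtype'])
interpreter.set_tensor(input_details[0]['index'], dummy)
interpreter.invoke()
print(interpreter.get_tensor(interpreter.get_output_details()[0]['index']))
```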

Thank you!