Change dimension of a TFLite model

Hi, I have a TFLite model whose input dimension is [1, 3, 224, 224]. I want to convert it to the dimension [1, 224, 224, 3]. Can anyone provide code to do this?

Hi,

If your plan is to perform inference with your TFLite model, you can adapt the inputs to the model's input shape specification: simply rearrange the image into the [1, 3, 224, 224] layout.
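For example, a minimal sketch in NumPy, assuming a 224×224 RGB image already loaded as a float array (note that moving the channel axis needs a transpose rather than a plain reshape, so the pixel values stay aligned):

```python
import numpy as np

# Hypothetical input: a 224x224 RGB image in HWC layout,
# e.g. np.asarray(PIL.Image.open(...).resize((224, 224)), dtype=np.float32)
image_hwc = np.random.rand(224, 224, 3).astype(np.float32)

# HWC -> CHW via transpose (a plain reshape would scramble the pixels),
# then add the batch dimension to get [1, 3, 224, 224].
image_nchw = np.transpose(image_hwc, (2, 0, 1))[np.newaxis, ...]
print(image_nchw.shape)  # (1, 3, 224, 224)
```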

Otherwise, check the tf.lite.Interpreter documentation (TensorFlow Core v2.8.0).
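A rough sketch of running inference with tf.lite.Interpreter in Python (the model path is a placeholder; the dummy input just matches whatever shape and dtype the model reports):

```python
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model.tflite")  # placeholder path
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
print(input_details[0]["shape"])  # e.g. [  1   3 224 224]

# Dummy input matching the shape/dtype the model expects.
dummy_input = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy_input)
interpreter.invoke()
output = interpreter.get_tensor(output_details[0]["index"])
print(output.shape)
```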


Hi @Shani_zee

I presume you have converted a PyTorch model to TFLite. As the TensorFlow Lite Support Library cannot currently handle input images with your dimension ordering, you have to do it the old way and provide the interpreter with a ByteBuffer in the correct order.

Check this project: GitHub - farmaker47/photos_with_depth
and also a Medium blog post: Estimate depth in RGB images. Written by George Soloupis ML GDE. | by George Soloupis | Medium

If you need more help, tag me.

Regards

How do I reshape my model? Basically, I convert a .mat file into ONNX and then convert that into the TFLite file.

I need code that converts my model to this dimension so it can run with the TFLite Android example.

Since you are already converting from .mat to .onnx, I’d suggest you go ahead and convert that .onnx into a TensorFlow frozen graph (.pb). Here’s the library suitable for that:
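Assuming the library meant here is onnx-tensorflow (onnx-tf), a minimal sketch of that conversion could look like this (the file paths are placeholders):

```python
import onnx
from onnx_tf.backend import prepare

# Load the ONNX model produced from the .mat -> ONNX step (placeholder path).
onnx_model = onnx.load("model.onnx")

# Convert to a TensorFlow representation and export it. Depending on the
# onnx-tf / TensorFlow versions installed, this writes either a frozen
# GraphDef (.pb) or a SavedModel directory.
tf_rep = prepare(onnx_model)
tf_rep.export_graph("model_tf.pb")
```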

This will give you a model that accepts channels in the last dimension as per your specifications. Now, to convert the frozen graph into a TFLite model you can use tf.compat.v1.lite.TFLiteConverter.from_frozen_graph. Here’s an end-to-end example of using this:
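A minimal sketch of that step (the graph file, tensor names, and shapes below are assumptions to replace with your model's actual values, e.g. as shown by a graph viewer such as Netron):

```python
import tensorflow as tf

converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file="model_tf.pb",   # placeholder: frozen graph from the previous step
    input_arrays=["input"],         # placeholder: actual input tensor name
    output_arrays=["output"],       # placeholder: actual output tensor name
    input_shapes={"input": [1, 224, 224, 3]},
)
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```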


@thea is it possible to install a notebook preview plugin that renders JSONs like the above in a nicer format?


Apologies for not seeing this earlier! I will investigate.

When will this issue be solved? I have my models, but I don't have a method to convert all of them to the TensorFlow model input size.