TFLite model output details

When converting my Keras model to TFLite, the converter infers the output shapes and signatures incorrectly. Is there a way I can set the output shapes manually?

FYI, it is a segmentation model.
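
For context, a minimal sketch of the conversion and of how the output details below were read (the exact conversion code may differ; `model` stands in for the actual segmentation model):

```python
import tensorflow as tf

# Convert the Keras model to TFLite (a sketch, not the exact conversion code).
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Load the converted model and inspect the inferred output details.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
print(interpreter.get_output_details())
```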

This is what I am getting:

```
[{'name': 'Identity',
  'index': 222,
  'shape': array([1, 1, 1, 1], dtype=int32),
  'shape_signature': array([-1, -1, -1, 1], dtype=int32),
  'dtype': <class 'numpy.float32'>,
  'quantization': (0.0, 0),
  'quantization_parameters': {'scales': array([], dtype=float32),
                              'zero_points': array([], dtype=int32),
                              'quantized_dimension': 0},
  'sparsity_parameters': {}}]
```

But I am expecting this:

```
[{'name': 'Identity',
  'index': 103,
  'shape': array([  1, 256, 256,   1], dtype=int32),
  'shape_signature': array([ -1, 256, 256,   1], dtype=int32),
  'dtype': <class 'numpy.float32'>,
  'quantization': (0.0, 0),
  'quantization_parameters': {'scales': array([], dtype=float32),
                              'zero_points': array([], dtype=int32),
                              'quantized_dimension': 0},
  'sparsity_parameters': {}}]
```
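
As a possible workaround (a sketch; `"model.tflite"` is a placeholder path, not from this post), the dynamic dimensions can usually be pinned at runtime by resizing the input tensor before allocating, after which the output details report concrete shapes:

```python
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model.tflite")  # placeholder path
input_index = interpreter.get_input_details()[0]["index"]

# Pin the dynamic batch/spatial dims to the expected input shape.
interpreter.resize_tensor_input(input_index, [1, 256, 256, 3])
interpreter.allocate_tensors()

print(interpreter.get_output_details())  # shapes should now be concrete
```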

The model summary:
```
Layer (type) Output Shape Param # Connected to
================================================================================
input_2 (InputLayer) [(None, 256, 256, 3)] 0
batch_normalization_32 (BatchNo (None, 256, 256, 3) 12 input_2[0][0]
activation_35 (Activation) (None, 256, 256, 3) 0 batch_normalization_32[0][0]
conv2d_36 (Conv2D) (None, 125, 125, 64) 9472 activation_35[0][0]
max_pooling2d_1 (MaxPooling2D) (None, 63, 63, 64) 0 conv2d_36[0][0]
batch_normalization_33 (BatchNo (None, 63, 63, 64) 256 max_pooling2d_1[0][0]
activation_36 (Activation) (None, 63, 63, 64) 0 batch_normalization_33[0][0]
conv2d_37 (Conv2D) (None, 32, 32, 64) 36928 activation_36[0][0]
batch_normalization_34 (BatchNo (None, 32, 32, 64) 256 conv2d_37[0][0]
activation_37 (Activation) (None, 32, 32, 64) 0 batch_normalization_34[0][0]
conv2d_39 (Conv2D) (None, 32, 32, 64) 4160 max_pooling2d_1[0][0]
conv2d_38 (Conv2D) (None, 32, 32, 64) 36928 activation_37[0][0]
add_11 (Add) (None, 32, 32, 64) 0 conv2d_39[0][0], conv2d_38[0][0]
batch_normalization_35 (BatchNo (None, 32, 32, 64) 256 add_11[0][0]
activation_38 (Activation) (None, 32, 32, 64) 0 batch_normalization_35[0][0]
conv2d_40 (Conv2D) (None, 32, 32, 64) 36928 activation_38[0][0]
batch_normalization_36 (BatchNo (None, 32, 32, 64) 256 conv2d_40[0][0]
activation_39 (Activation) (None, 32, 32, 64) 0 batch_normalization_36[0][0]
conv2d_41 (Conv2D) (None, 32, 32, 64) 36928 activation_39[0][0]
add_12 (Add) (None, 32, 32, 64) 0 add_11[0][0], conv2d_41[0][0]
batch_normalization_37 (BatchNo (None, 32, 32, 64) 256 add_12[0][0]
activation_40 (Activation) (None, 32, 32, 64) 0 batch_normalization_37[0][0]
conv2d_42 (Conv2D) (None, 16, 16, 128) 73856 activation_40[0][0]
batch_normalization_38 (BatchNo (None, 16, 16, 128) 512 conv2d_42[0][0]
activation_41 (Activation) (None, 16, 16, 128) 0 batch_normalization_38[0][0]
conv2d_44 (Conv2D) (None, 16, 16, 128) 8320 add_12[0][0]
conv2d_43 (Conv2D) (None, 16, 16, 128) 147584 activation_41[0][0]
add_13 (Add) (None, 16, 16, 128) 0 conv2d_44[0][0], conv2d_43[0][0]
batch_normalization_39 (BatchNo (None, 16, 16, 128) 512 add_13[0][0]
activation_42 (Activation) (None, 16, 16, 128) 0 batch_normalization_39[0][0]
conv2d_45 (Conv2D) (None, 16, 16, 128) 147584 activation_42[0][0]
batch_normalization_40 (BatchNo (None, 16, 16, 128) 512 conv2d_45[0][0]
activation_43 (Activation) (None, 16, 16, 128) 0 batch_normalization_40[0][0]
conv2d_46 (Conv2D) (None, 16, 16, 128) 147584 activation_43[0][0]
add_14 (Add) (None, 16, 16, 128) 0 add_13[0][0], conv2d_46[0][0]
batch_normalization_41 (BatchNo (None, 16, 16, 128) 512 add_14[0][0]
activation_44 (Activation) (None, 16, 16, 128) 0 batch_normalization_41[0][0]
conv2d_47 (Conv2D) (None, 8, 8, 256) 295168 activation_44[0][0]
batch_normalization_42 (BatchNo (None, 8, 8, 256) 1024 conv2d_47[0][0]
activation_45 (Activation) (None, 8, 8, 256) 0 batch_normalization_42[0][0]
conv2d_49 (Conv2D) (None, 8, 8, 256) 33024 add_14[0][0]
conv2d_48 (Conv2D) (None, 8, 8, 256) 590080 activation_45[0][0]
add_15 (Add) (None, 8, 8, 256) 0 conv2d_49[0][0], conv2d_48[0][0]
batch_normalization_43 (BatchNo (None, 8, 8, 256) 1024 add_15[0][0]
activation_46 (Activation) (None, 8, 8, 256) 0 batch_normalization_43[0][0]
conv2d_50 (Conv2D) (None, 8, 8, 256) 590080 activation_46[0][0]
batch_normalization_44 (BatchNo (None, 8, 8, 256) 1024 conv2d_50[0][0]
activation_47 (Activation) (None, 8, 8, 256) 0 batch_normalization_44[0][0]
conv2d_51 (Conv2D) (None, 8, 8, 256) 590080 activation_47[0][0]
add_16 (Add) (None, 8, 8, 256) 0 add_15[0][0], conv2d_51[0][0]
batch_normalization_45 (BatchNo (None, 8, 8, 256) 1024 add_16[0][0]
activation_48 (Activation) (None, 8, 8, 256) 0 batch_normalization_45[0][0]
conv2d_52 (Conv2D) (None, 4, 4, 512) 1180160 activation_48[0][0]
batch_normalization_46 (BatchNo (None, 4, 4, 512) 2048 conv2d_52[0][0]
activation_49 (Activation) (None, 4, 4, 512) 0 batch_normalization_46[0][0]
conv2d_54 (Conv2D) (None, 4, 4, 512) 131584 add_16[0][0]
conv2d_53 (Conv2D) (None, 4, 4, 512) 2359808 activation_49[0][0]
add_17 (Add) (None, 4, 4, 512) 0 conv2d_54[0][0], conv2d_53[0][0]
batch_normalization_47 (BatchNo (None, 4, 4, 512) 2048 add_17[0][0]
activation_50 (Activation) (None, 4, 4, 512) 0 batch_normalization_47[0][0]
conv2d_55 (Conv2D) (None, 4, 4, 512) 2359808 activation_50[0][0]
batch_normalization_48 (BatchNo (None, 4, 4, 512) 2048 conv2d_55[0][0]
activation_51 (Activation) (None, 4, 4, 512) 0 batch_normalization_48[0][0]
conv2d_56 (Conv2D) (None, 4, 4, 512) 2359808 activation_51[0][0]
add_18 (Add) (None, 4, 4, 512) 0 add_17[0][0], conv2d_56[0][0]
batch_normalization_49 (BatchNo (None, 4, 4, 512) 2048 add_18[0][0]
activation_52 (Activation) (None, 4, 4, 512) 0 batch_normalization_49[0][0]
conv2d_57 (Conv2D) (None, 4, 4, 128) 65664 activation_52[0][0]
up_sampling2d_6 (UpSampling2D) (None, 8, 8, 128) 0 conv2d_57[0][0]
batch_normalization_50 (BatchNo (None, 8, 8, 128) 512 up_sampling2d_6[0][0]
activation_53 (Activation) (None, 8, 8, 128) 0 batch_normalization_50[0][0]
conv2d_58 (Conv2D) (None, 8, 8, 128) 147584 activation_53[0][0]
batch_normalization_51 (BatchNo (None, 8, 8, 128) 512 conv2d_58[0][0]
activation_54 (Activation) (None, 8, 8, 128) 0 batch_normalization_51[0][0]
conv2d_59 (Conv2D) (None, 8, 8, 256) 33024 activation_54[0][0]
add_19 (Add) (None, 8, 8, 256) 0 conv2d_59[0][0], add_16[0][0]
activation_55 (Activation) (None, 8, 8, 256) 0 add_19[0][0]
batch_normalization_52 (BatchNo (None, 8, 8, 256) 1024 activation_55[0][0]
activation_56 (Activation) (None, 8, 8, 256) 0 batch_normalization_52[0][0]
conv2d_60 (Conv2D) (None, 8, 8, 64) 16448 activation_56[0][0]
up_sampling2d_7 (UpSampling2D) (None, 16, 16, 64) 0 conv2d_60[0][0]
batch_normalization_53 (BatchNo (None, 16, 16, 64) 256 up_sampling2d_7[0][0]
activation_57 (Activation) (None, 16, 16, 64) 0 batch_normalization_53[0][0]
conv2d_61 (Conv2D) (None, 16, 16, 64) 36928 activation_57[0][0]
batch_normalization_54 (BatchNo (None, 16, 16, 64) 256 conv2d_61[0][0]
activation_58 (Activation) (None, 16, 16, 64) 0 batch_normalization_54[0][0]
conv2d_62 (Conv2D) (None, 16, 16, 128) 8320 activation_58[0][0]
add_20 (Add) (None, 16, 16, 128) 0 conv2d_62[0][0], add_14[0][0]
activation_59 (Activation) (None, 16, 16, 128) 0 add_20[0][0]
batch_normalization_55 (BatchNo (None, 16, 16, 128) 512 activation_59[0][0]
activation_60 (Activation) (None, 16, 16, 128) 0 batch_normalization_55[0][0]
conv2d_63 (Conv2D) (None, 16, 16, 32) 4128 activation_60[0][0]
up_sampling2d_8 (UpSampling2D) (None, 32, 32, 32) 0 conv2d_63[0][0]
batch_normalization_56 (BatchNo (None, 32, 32, 32) 128 up_sampling2d_8[0][0]
activation_61 (Activation) (None, 32, 32, 32) 0 batch_normalization_56[0][0]
conv2d_64 (Conv2D) (None, 32, 32, 32) 9248 activation_61[0][0]
batch_normalization_57 (BatchNo (None, 32, 32, 32) 128 conv2d_64[0][0]
activation_62 (Activation) (None, 32, 32, 32) 0 batch_normalization_57[0][0]
conv2d_65 (Conv2D) (None, 32, 32, 64) 2112 activation_62[0][0]
add_21 (Add) (None, 32, 32, 64) 0 conv2d_65[0][0], add_12[0][0]
activation_63 (Activation) (None, 32, 32, 64) 0 add_21[0][0]
batch_normalization_58 (BatchNo (None, 32, 32, 64) 256 activation_63[0][0]
activation_64 (Activation) (None, 32, 32, 64) 0 batch_normalization_58[0][0]
conv2d_66 (Conv2D) (None, 32, 32, 16) 1040 activation_64[0][0]
up_sampling2d_9 (UpSampling2D) (None, 64, 64, 16) 0 conv2d_66[0][0]
batch_normalization_59 (BatchNo (None, 64, 64, 16) 64 up_sampling2d_9[0][0]
activation_65 (Activation) (None, 64, 64, 16) 0 batch_normalization_59[0][0]
conv2d_67 (Conv2D) (None, 64, 64, 16) 2320 activation_65[0][0]
batch_normalization_60 (BatchNo (None, 64, 64, 16) 64 conv2d_67[0][0]
activation_66 (Activation) (None, 64, 64, 16) 0 batch_normalization_60[0][0]
conv2d_68 (Conv2D) (None, 64, 64, 64) 1088 activation_66[0][0]
up_sampling2d_10 (UpSampling2D) (None, 128, 128, 64) 0 conv2d_68[0][0]
batch_normalization_61 (BatchNo (None, 128, 128, 64) 256 up_sampling2d_10[0][0]
activation_67 (Activation) (None, 128, 128, 64) 0 batch_normalization_61[0][0]
conv2d_69 (Conv2D) (None, 128, 128, 32) 18464 activation_67[0][0]
batch_normalization_62 (BatchNo (None, 128, 128, 32) 128 conv2d_69[0][0]
activation_68 (Activation) (None, 128, 128, 32) 0 batch_normalization_62[0][0]
conv2d_70 (Conv2D) (None, 128, 128, 32) 9248 activation_68[0][0]
up_sampling2d_11 (UpSampling2D) (None, 256, 256, 32) 0 conv2d_70[0][0]
batch_normalization_63 (BatchNo (None, 256, 256, 32) 128 up_sampling2d_11[0][0]
activation_69 (Activation) (None, 256, 256, 32) 0 batch_normalization_63[0][0]
conv2d_71 (Conv2D) (None, 256, 256, 1) 129 activation_69[0][0]
```

```
Total params: 11,551,469
Trainable params: 11,541,543
Non-trainable params: 9,926
```


Hi @Evans_Kiplagat

I think you need to share a Colab notebook with us so we can see the model and the conversion.
Can you do that?

Thanks

I loaded the model on Colab to share it and tried the conversion there, which gave me a converted model with the correct output details. That made me wonder why it wasn't working on my machine, so I checked my TensorFlow version: it was 2.4.1, compared to Colab's 2.6.0. After upgrading, it works fine now.
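
For anyone hitting the same symptom, a quick sketch of the version check (the 2.4.1 → 2.6.0 upgrade is what resolved it in this thread):

```python
import tensorflow as tf

# The wrong output shapes showed up on TF 2.4.1; 2.6.0 inferred them correctly.
print(tf.__version__)

# Upgrade from a shell if needed:
#   pip install --upgrade tensorflow
```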
