Converting TensorFlow Model Zoo models to TensorRT not possible?

I am trying to use a model from the TensorFlow Model Zoo on a Jetson Nano. First I convert the model (a .pb file) to ONNX format.
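For reference, the export step looks roughly like this (a sketch assuming tf2onnx and the SavedModel directory that ships with the Model Zoo download; the model path is a placeholder):

```python
# Sketch: export a TensorFlow Model Zoo SavedModel to ONNX with tf2onnx.
# The path below is a hypothetical example; opset 11 is a common choice.
import tf2onnx

model_proto, _ = tf2onnx.convert.from_saved_model(
    "ssd_mobilenet_v2/saved_model",  # hypothetical Model Zoo directory
    opset=11,
    output_path="model.onnx",
)
```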
But when I try to convert it to TensorRT format (after the ONNX step), I only get an error saying:
[8] Assertion failed: convertDtype(onnxDtype.elem_type(), &trtDtype)
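For context, the conversion step I mean is roughly the following (a sketch using the tensorrt Python bindings; this is where the assertion above gets reported):

```python
# Sketch: parse the ONNX file with the TensorRT Python bindings.
# Parser errors such as the dtype assertion above surface here.
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
```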

I then run a script that changes the data type to float32, since uint8 is not supported by TensorRT. When I try to convert the model to TensorRT format again, I get a new error:
[8] Assertion failed: cond.is_weights() && cond.weights().count() == 1 && "If condition must be a initializer!"
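The dtype script I mentioned does roughly the following (a sketch using the onnx Python package; it only rewrites the declared graph inputs, so uint8 tensors elsewhere in the graph are untouched):

```python
# Sketch: rewrite uint8 graph inputs to float32 in the ONNX file.
import onnx

model = onnx.load("model.onnx")
for inp in model.graph.input:
    if inp.type.tensor_type.elem_type == onnx.TensorProto.UINT8:
        inp.type.tensor_type.elem_type = onnx.TensorProto.FLOAT
onnx.save(model, "model_float32.onnx")
```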

And at this point I am stuck. It seems to me that Jetson/Nvidia does not actually support object detection using models from the TensorFlow Model Zoo?
At least I get the impression that most of the layers the Model Zoo models are built on are not supported by TensorRT or ONNX, which therefore creates all sorts of problems?

Note that I am not using the tf.experimental.tensorrt.Converter, but the actual TensorRT library provided by Nvidia, to convert the model.
Thanks for any help!
