I’ve been following some tutorials on training a custom object detection model using the TensorFlow 2.x Object Detection API.
Everything works until I try to export the trained inference graph. In TensorFlow 1.x, there is a script, master/research/object_detection/export_inference_graph.py, which exports trained model checkpoints to a single frozen inference graph.
In TensorFlow 2.x, this script no longer works. Instead, we use master/research/object_detection/exporter_main_v2.py, which outputs a SavedModel directory (and some other files), but not a frozen inference graph, because frozen graphs have been deprecated in TF 2.x.
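For reference, the export step I’m running looks roughly like this (the paths are placeholders for my own config and checkpoint directories):

```shell
# Placeholder paths; adjust to your own pipeline config and checkpoint dir
python exporter_main_v2.py \
    --input_type image_tensor \
    --pipeline_config_path path/to/pipeline.config \
    --trained_checkpoint_dir path/to/checkpoint \
    --output_directory path/to/exported_model
```

This produces `path/to/exported_model/saved_model/` plus a checkpoint, but no `.pb` frozen graph.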
I want to produce the same kind of single frozen inference graph that TensorFlow 1 gave, but from TensorFlow 2. I tried following this post, Lei Mao's Log Book – Save, Load and Inference From TensorFlow 2.x Frozen Graph, but I ran into a “_UserObject has no attribute ‘inputs’” error.
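The approach I’m attempting is `convert_variables_to_constants_v2`, which freezes a ConcreteFunction’s variables into constants. A minimal, self-contained sketch of that technique (using a tiny stand-in `tf.Module` instead of my actual detection model, and pulling the function from `loaded.signatures` rather than from attributes of the loaded object, which is where the “_UserObject has no attribute ‘inputs’” error seems to come from):

```python
import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import convert_variables_to_constants_v2

# Tiny stand-in model; in practice this SavedModel directory would come
# from exporter_main_v2.py
class TinyModel(tf.Module):
    def __init__(self):
        super().__init__()
        self.w = tf.Variable([[2.0]])

    @tf.function(input_signature=[tf.TensorSpec([1, 1], tf.float32)])
    def __call__(self, x):
        return tf.matmul(x, self.w)

model = TinyModel()
tf.saved_model.save(model, "tiny_saved_model",
                    signatures=model.__call__.get_concrete_function())

# Load the SavedModel and pull a ConcreteFunction from its signatures;
# the loaded object itself is a _UserObject with no 'inputs' attribute,
# so it can't be treated like a Keras model
loaded = tf.saved_model.load("tiny_saved_model")
concrete_func = loaded.signatures["serving_default"]

# Inline all variables as constants, producing a frozen graph
frozen_func = convert_variables_to_constants_v2(concrete_func)

# Serialize the frozen GraphDef to a single .pb file, as in TF 1.x
tf.io.write_graph(frozen_func.graph, ".", "frozen_graph.pb", as_text=False)
```

This works for the toy module above, but I’m not sure it is the right way to handle the multi-output detection signature produced by the Object Detection API.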
Does anyone know how I can work around this error, or whether there is another way to export an object detection SavedModel to a single frozen inference graph?