I tried googling around, but there doesn’t seem to be an elegant way to manually edit TF checkpoints. For TF1 checkpoints I found a dirty hack, but it does not work for TF2 checkpoints.
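For reference, here is a sketch of the kind of dirty hack I mean for TF1-style checkpoints (the function name and the skip of `_CHECKPOINTABLE_OBJECT_GRAPH` are my own; it ignores partitioned variables and optimizer slots):

```python
import tensorflow as tf

def rename_tf1_checkpoint(ckpt_path, out_path, rename_fn):
    """Read every tensor from ckpt_path and re-save it under rename_fn(name).

    Works by rebuilding each stored tensor as a plain variable in a TF1-style
    graph and saving with a v1 Saver, so anything that is not an ordinary
    variable (e.g. _CHECKPOINTABLE_OBJECT_GRAPH) is dropped.
    """
    reader = tf.train.load_checkpoint(ckpt_path)
    shape_map = reader.get_variable_to_shape_map()
    with tf.Graph().as_default():
        for old_name in shape_map:
            if old_name == "_CHECKPOINTABLE_OBJECT_GRAPH":
                continue  # can't be recreated as a regular variable
            value = reader.get_tensor(old_name)
            tf.compat.v1.Variable(value, name=rename_fn(old_name))
        with tf.compat.v1.Session() as sess:
            sess.run(tf.compat.v1.global_variables_initializer())
            tf.compat.v1.train.Saver().save(sess, out_path)
```

This round-trips plain TF1 checkpoints fine; the problem described below is precisely that TF2 checkpoints carry extra state this approach loses.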
It seems that TF2 checkpoints contain an additional variable, `_CHECKPOINTABLE_OBJECT_GRAPH` (which appears to hold a byte string). I tried saving a checkpoint without it, but when TF2 loads that checkpoint I get a flood of warnings like `WARNING:tensorflow:Unresolved object in checkpoint: (root).chip_layer.layer_with_weights-0` (aside from the `(root)` part, the rest of the name is correct). However, I can’t create or edit a variable by that name, and hence can’t save a new checkpoint that includes it. And that doesn’t rule out the checkpoint containing other state that’s even harder to access.
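For anyone wanting to look at what a checkpoint actually stores, here is a small dump helper (my own name for it). As far as I can tell, `_CHECKPOINTABLE_OBJECT_GRAPH` is a serialized `TrackableObjectGraph` protobuf recording the object hierarchy, which is why deleting it triggers those "Unresolved object" warnings:

```python
import tensorflow as tf

def dump_checkpoint(ckpt_path):
    """Print every key stored in a checkpoint, with its shape and dtype.

    In a TF2 checkpoint this shows both the per-variable keys (ending in
    .ATTRIBUTES/VARIABLE_VALUE) and the _CHECKPOINTABLE_OBJECT_GRAPH entry.
    """
    reader = tf.train.load_checkpoint(ckpt_path)
    dtypes = reader.get_variable_to_dtype_map()
    for name, shape in tf.train.list_variables(ckpt_path):
        print(name, shape, dtypes[name].name)
```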
I really do not want to use a hex editor…
I recently converted a large project from TF1 to TF2 with the Keras interface. However, doing so also drastically changed the names of all the variables. I would like to keep using previously trained models rather than train from scratch. As a desperate workaround, the best I can come up with is to load the model in TF2 and manually copy the variables in from the old checkpoint, one at a time.
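The one-variable-at-a-time workaround would look roughly like this (a sketch; the function name and the hand-built `name_map` are my own, and building that old-name → new-name mapping is the tedious part):

```python
import tensorflow as tf

def restore_from_old_names(model, old_ckpt_path, name_map):
    """Copy tensors from an old checkpoint into a model's weights one at a time.

    name_map maps each new variable's .name to the key it had in the old
    checkpoint; variables with no entry are left at their current values.
    """
    reader = tf.train.load_checkpoint(old_ckpt_path)
    for var in model.weights:
        old_key = name_map.get(var.name)
        if old_key is not None:
            var.assign(reader.get_tensor(old_key))
```

After restoring this way, saving a fresh checkpoint from the TF2 model produces one with the new names and a valid `_CHECKPOINTABLE_OBJECT_GRAPH`, so the old checkpoint only has to be touched once.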