Error when using `load_model` on a BERT-MLM SavedModel

Hi, I pre-trained BERT using the official scripts on TPU. After a few days I downloaded the checkpoints and used this script to convert them to the SavedModel format (using the `--export_type model_with_mlm` flag to export the MLM head alongside the encoder). I also exported the preprocessor with the same script.

The problem appears when I try to load the preprocessor and the model with the `tf.keras.models.load_model()` function. The preprocessor loads correctly, but loading the BERT model fails with the following error:


```
KeyError                                  Traceback (most recent call last)
----> 1 mlm = tf.keras.models.load_model(BERT_MLM_PATH)

1 frames
/usr/local/lib/python3.8/dist-packages/keras/utils/ in error_handler(*args, **kwargs)
     68   # To get the full stack trace, call:
     69   # tf.debugging.disable_traceback_filtering()
---> 70   raise e.with_traceback(filtered_tb) from None
     71 finally:
     72   del filtered_tb

/usr/local/lib/python3.8/dist-packages/keras/saving/legacy/saved_model/ in _revive_graph_network(self, identifier, metadata, node_id)
    559   else:
    560     model = models_lib.Functional(
--> 561         inputs=[], outputs=[], name=config["name"]
    562     )

KeyError: 'name'
```

Hi @Iulian277 ,

Is it possible for you to share a simple standalone code snippet to reproduce the issue? There have been a lot of improvements to model saving and loading. Could you also please share the TensorFlow version used here? The issue is resolved in 2.8 onwards.
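A quick way to report the version (a minimal snippet, assuming a standard TensorFlow install):

```python
import tensorflow as tf

# Print the installed TensorFlow version so it can be shared in the report.
print(tf.__version__)
```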

Please note that if you have a TF model, it is suggested to use the TF save/load APIs, `tf.saved_model.save` and `tf.saved_model.load`. Similarly, if you have a Keras model, use `tf.keras.models.save_model` and `tf.keras.models.load_model`.
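To illustrate the low-level pairing: here is a minimal sketch using a toy `tf.Module` as a stand-in for the exported model (the `Toy` class and its shapes are made up for the example, not part of the BERT export). Note that `tf.saved_model.load` returns a restored trackable object, not a Keras model, which is why it can read SavedModels that `tf.keras.models.load_model` cannot revive:

```python
import tempfile
import tensorflow as tf

# Toy stand-in for an exported model: a tf.Module with a traced call function.
class Toy(tf.Module):
    def __init__(self):
        super().__init__()
        self.w = tf.Variable(tf.ones([8, 4]))

    @tf.function(input_signature=[tf.TensorSpec([None, 8], tf.float32)])
    def __call__(self, x):
        return tf.matmul(x, self.w)

export_dir = tempfile.mkdtemp()

# Low-level TF pairing: tf.saved_model.save / tf.saved_model.load.
tf.saved_model.save(Toy(), export_dir)
restored = tf.saved_model.load(export_dir)

# The restored object exposes the traced function, but is not a Keras model.
out = restored(tf.random.uniform((2, 8)))
print(out.shape)  # (2, 4)
```

If the Keras-level reload keeps failing, loading the SavedModel this way at least lets you run inference through the traced functions or `restored.signatures` while the issue is investigated.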