Trying to create optimizer slot variable under the scope for tf.distribute.Strategy, which is different from the scope used for the original variable

Hi,

I’m training a model under a tf.distribute strategy; this is the general idea of the code:

import tensorflow as tf
import tensorflow_addons as tfa

with strategy.scope():
    # Create and build the model
    model = My_Model(...)   # constructor arguments omitted
    model.build(...)        # input shape omitted
    model.summary()

    # Adam, wrapped so that an exponential moving average of the
    # weights is maintained
    opt = tf.keras.optimizers.Adam(learning_rate)
    opt = tfa.optimizers.MovingAverage(opt, average_decay=ema_decay)

    model.compile(
        optimizer=opt,
        loss=loss_dict,
        loss_weights=loss_weights,
        metrics=metrics_dict)

    # Second model sharing layers with `model`, exposing only the
    # softmax output layers
    output_list = [...]  # the softmax output layers of model
    model_b = tf.keras.Model(inputs=model.input, outputs=output_list)
    model_b.build((None, None, feats_dim, 1))
    model_b.compile(optimizer=opt)

model.fit(..., callbacks=[cp_callback, logger_callback, tb_callback])

checkpoint_path = ...  # path to the file saved by cp_callback
model.load_weights(checkpoint_path)

where:
cp_callback = tf.keras.callbacks.ModelCheckpoint(
    filepath=checkpoint_path, verbose=1, save_weights_only=True, period=1)
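
and strategy is the tf.distribute strategy created before entering the with block; given the MirroredVariable in the error below, something along the lines of:

strategy = tf.distribute.MirroredStrategy()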

It seems the weights can’t be loaded back: load_weights raises the following error:

ValueError: Trying to create optimizer slot variable under the scope for tf.distribute.Strategy (<tensorflow.python.distribute.distribute_lib._DefaultDistributionStrategy object at 0x7f37e570feb8>), which is different from the scope used for the original variable (MirroredVariable:{

Any ideas how to fix this or what to check?
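
One guess, based on the error pointing at the default distribution strategy: maybe the weights have to be restored under the same strategy.scope() the variables were created in, i.e. (a sketch only, untested):

with strategy.scope():
    model.load_weights(checkpoint_path)

Is that the intended way to restore a checkpoint when training with a distribution strategy? Thanks!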