Multiple learning rates in Keras

In Keras, `model.compile` accepts only a single learning rate, but I need multiple learning rates for my model. For example, my model includes a backbone with a learning rate of 10^-4 and a transformer with a learning rate of 10^-3. How can I set these two learning rates in `model.compile` with the Adam optimizer (or any other optimizer)?

It seems that TensorFlow Addons has an optimizer with this capability: `tfa.optimizers.MultiOptimizer` (see the TensorFlow Addons documentation).
