Tutorial on Cyclical LR

It’s been a while since TensorFlow Addons released an easy-to-use wrapper for Cyclical Learning Rates. Today, we are pleased to release its accompanying tutorial guide.

Usage is straightforward:

```python
import tensorflow as tf
import tensorflow_addons as tfa

# One half-cycle spans two epochs' worth of optimizer steps
steps_per_epoch = len(x_train) // BATCH_SIZE
clr = tfa.optimizers.CyclicalLearningRate(
    initial_learning_rate=INIT_LR,
    maximal_learning_rate=MAX_LR,
    scale_fn=lambda x: 1 / (2.0 ** (x - 1)),  # halve the amplitude each cycle
    step_size=2 * steps_per_epoch,
)
optimizer = tf.keras.optimizers.SGD(clr)
```
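To build intuition for what the schedule produces, here is a pure-Python sketch of the cyclical formula from Smith's Cyclical Learning Rates paper, which the wrapper follows: the learning rate ramps linearly from the initial value up to the maximum and back down over each cycle, and the `scale_fn` above halves the peak every cycle. The concrete values below (`1e-4`, `1e-2`, `step_size=2000`) are illustrative placeholders, not values from the post.

```python
import math

def cyclical_lr(step, initial_lr=1e-4, max_lr=1e-2, step_size=2000,
                scale_fn=lambda c: 1.0 / (2.0 ** (c - 1))):
    # Which cycle we are in (1-indexed); a full cycle is 2 * step_size steps
    cycle = math.floor(1 + step / (2 * step_size))
    # Distance from the cycle's peak: x = 0 at the peak, x = 1 at the bounds
    x = abs(step / step_size - 2 * cycle + 1)
    # Linear interpolation between the bounds, with the peak scaled per cycle
    return initial_lr + (max_lr - initial_lr) * max(0.0, 1 - x) * scale_fn(cycle)

print(cyclical_lr(0))     # floor of the range: 1e-4
print(cyclical_lr(2000))  # peak of the first cycle: 1e-2
print(cyclical_lr(6000))  # peak of the second cycle, amplitude halved
```

Passing a schedule object instead of a float to `SGD` means the optimizer queries it at every step, so the learning rate oscillates during training without any callback code.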