Custom Loss Function: "ValueError: No gradients provided for any variable"

I’m trying to implement a custom loss function, in this case one that maximizes top-10 categorical accuracy (i.e., minimizes 1 − top-10 accuracy) for a TensorFlow Recommenders two-tower retrieval model.

Unfortunately, no matter what I try, I receive a “ValueError: No gradients provided for any variable” error when fitting my model.

See the loss function code below.

import tensorflow as tf

class Top10CategoricalAccuracy(tf.keras.losses.Loss):

  def __init__(self, name='top10_categorical_accuracy'):
    super().__init__(name=name)

  def call(self, y_true, y_pred, sample_weight=None):
    # Convert the predictions to a categorical distribution.
    y_pred = tf.keras.backend.softmax(y_pred)

    # Compute the top-10 accuracy.
    top_10_accuracy = tf.keras.metrics.top_k_categorical_accuracy(y_true, y_pred, k=10)

    # Return the loss.
    return 1.0 - top_10_accuracy

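For context, the loss is wired into the retrieval task roughly as in the sketch below. This is a simplified illustration rather than my exact model; the tower and feature names (query_model, candidate_model, "query_id", "candidate_id") are placeholders.

import tensorflow as tf
import tensorflow_recommenders as tfrs

class TwoTowerModel(tfrs.Model):

  def __init__(self, query_model, candidate_model):
    super().__init__()
    self.query_model = query_model
    self.candidate_model = candidate_model
    # The retrieval task accepts a custom loss in place of the default
    # categorical cross-entropy.
    self.task = tfrs.tasks.Retrieval(loss=Top10CategoricalAccuracy())

  def compute_loss(self, features, training=False):
    query_embeddings = self.query_model(features["query_id"])
    candidate_embeddings = self.candidate_model(features["candidate_id"])
    # The task builds the in-batch similarity logits and applies the loss;
    # this is where the error surfaces during model.fit().
    return self.task(query_embeddings, candidate_embeddings)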
The error disappears if I instead delegate to the built-in tf.keras.losses.CategoricalCrossentropy loss function:

    self.cce_loss_calculator = tf.keras.losses.CategoricalCrossentropy(from_logits=True)
    return self.cce_loss_calculator(y_true, y_pred, sample_weight)
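
That is, a class that simply wraps the built-in loss trains without complaint. Reconstructed in full for clarity (the class name is arbitrary):

import tensorflow as tf

class DelegatedCrossEntropy(tf.keras.losses.Loss):

  def __init__(self, name='delegated_cross_entropy'):
    super().__init__(name=name)
    self.cce_loss_calculator = tf.keras.losses.CategoricalCrossentropy(from_logits=True)

  def call(self, y_true, y_pred, sample_weight=None):
    # Delegating to the built-in cross-entropy produces no gradient errors.
    return self.cce_loss_calculator(y_true, y_pred, sample_weight)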

How can I implement a custom loss function that doesn’t complain about “no gradients”?