The gradient in the update_step() function in Keras

In tf.keras, the update_step() function is used by an optimizer to update a variable at each iteration, given the variable and its gradient. Is this gradient the full gradient computed over all the samples in the batch, or the gradient for a single sample in the batch?
Thank you.

@dali_dali,

When you use model.fit() or model.train_on_batch(), the optimizer's update_step() is called after the forward pass, the loss computation, and the gradient computation for a batch of data.

The gradient passed to update_step() is the full gradient computed over all the samples in the batch (i.e., the batch gradient), typically the gradient of the mean loss over the batch. There is one such gradient per trainable variable, with the same shape as the variable, not one gradient per sample.
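A minimal sketch to illustrate this, assuming TensorFlow 2.x (model, data, and shapes below are made up for the example): the gradient of the batch loss has the same shape as each variable, with no per-sample dimension, and apply_gradients() hands one (gradient, variable) pair at a time to the optimizer's update logic (update_step() in recent Keras optimizer versions).

```python
import tensorflow as tf

# A toy model and a batch of 32 samples (shapes chosen arbitrarily).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(1),
])
loss_fn = tf.keras.losses.MeanSquaredError()  # reduces to the mean over the batch
optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)

x = tf.random.normal((32, 4))
y = tf.random.normal((32, 1))

with tf.GradientTape() as tape:
    loss = loss_fn(y, model(x, training=True))  # scalar loss for the whole batch

# One gradient tensor per trainable variable, computed from the whole batch.
grads = tape.gradient(loss, model.trainable_variables)

for var, grad in zip(model.trainable_variables, grads):
    # grad.shape == var.shape: no batch dimension, so this is the batch gradient.
    print(var.name, var.shape, grad.shape)

# apply_gradients() dispatches each (gradient, variable) pair to the
# per-variable update (update_step() in recent Keras optimizers).
optimizer.apply_gradients(zip(grads, model.trainable_variables))
```

Running this prints a kernel gradient of shape (4, 1) and a bias gradient of shape (1,), matching the variables rather than the 32 samples, which is exactly what the optimizer receives.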

Thank you!