Can graph mode change outer-scope variables? Possible contradiction between TensorFlow tutorials

The graph-mode tutorial says that everything a function decorated with `@tf.function` produces should be returned, and that modifying global variables directly from inside it is bad practice. But in the custom training tutorial, the decorated training step mutates state defined outside it: the accuracy metric via `train_acc_metric.update_state()`, the optimizer via `optimizer.apply_gradients()`, the loss function, and even the model itself. None of these objects is passed in as a parameter or returned, yet their values outside the decorated function are modified.

Why is it acceptable in this case to change outer-scope values by direct access, without returning them, when the other tutorial advises against it?
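For reference, the pattern I mean looks roughly like this (a minimal sketch; the tiny model and data here are my own stand-ins, not the tutorial's):

```python
import tensorflow as tf

# Objects defined in the OUTER scope, mirroring the custom training
# tutorial's setup (model, optimizer, loss function, metric).
model = tf.keras.Sequential([tf.keras.layers.Dense(2)])
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
train_acc_metric = tf.keras.metrics.SparseCategoricalAccuracy()

@tf.function
def train_step(x, y):
    with tf.GradientTape() as tape:
        logits = model(x, training=True)          # reads outer-scope model
        loss = loss_fn(y, logits)                 # reads outer-scope loss fn
    grads = tape.gradient(loss, model.trainable_weights)
    # Both calls below MUTATE outer-scope objects without them being
    # passed in or returned -- this is what seems to contradict the
    # graph-mode tutorial's advice:
    optimizer.apply_gradients(zip(grads, model.trainable_weights))
    train_acc_metric.update_state(y, logits)
    return loss

# One illustrative step on random data:
x = tf.random.normal((4, 3))
y = tf.constant([0, 1, 0, 1])
loss = train_step(x, y)
```

After this call, `train_acc_metric.result()` reflects the update made inside the traced function, even though the metric was never returned.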