How gamma and beta get updated during the backward pass in a batch normalization layer

Could you please let me know whether it is possible to check how the gamma and beta of a batch normalization layer get updated during backpropagation? In other words, can I print their values at each epoch?
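One way to do this, shown here as a sketch in PyTorch (where gamma is exposed as the layer's `.weight` and beta as its `.bias`), is to print those parameters after each optimizer step. The toy model, data, and hyperparameters below are placeholders:

```python
import torch
import torch.nn as nn

# Toy model with a BatchNorm1d layer between two hidden layers
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.BatchNorm1d(8),   # gamma -> .weight, beta -> .bias
    nn.ReLU(),
    nn.Linear(8, 1),
)
bn = model[1]

optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

x = torch.randn(32, 4)
y = torch.randn(32, 1)

for epoch in range(3):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    # gamma and beta are ordinary learnable parameters, so the
    # optimizer updates them like any other weight each step
    print(f"epoch {epoch}: gamma = {bn.weight.data}, beta = {bn.bias.data}")
```

In Keras the equivalent would be reading `layer.get_weights()` (gamma and beta are the first two arrays) inside a custom callback's `on_epoch_end`.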

How does Batch Norm work?

Batch Norm is just another network layer that gets inserted between one hidden layer and the next. Its job is to take the outputs of the first hidden layer and normalize them before passing them on as the input to the next hidden layer.
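That normalize-then-rescale step can be sketched as follows (training-mode forward pass only; the function name and toy data are illustrative, not from any library):

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Minimal sketch of the Batch Norm forward pass in training mode."""
    mu = x.mean(axis=0)                     # per-feature batch mean
    var = x.var(axis=0)                     # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)   # normalize: ~zero mean, unit variance
    return gamma * x_hat + beta             # learnable scale (gamma) and shift (beta)

# Activations from a hypothetical preceding layer, deliberately off-center
x = np.random.randn(32, 8) * 3.0 + 5.0
out = batch_norm_forward(x, gamma=np.ones(8), beta=np.zeros(8))
print(out.mean(axis=0))  # close to 0 per feature
print(out.std(axis=0))   # close to 1 per feature
```

With gamma = 1 and beta = 0 (their usual initialization) the layer outputs the plainly normalized activations; during training, gradients flow to gamma and beta so the network can learn a different scale and shift if that helps.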