BatchNormalization Weights

Each unit in the layer has four parameters: beta, gamma, moving mean, and moving variance. This much is documented.
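For reference, here is a minimal NumPy sketch of what those four per-unit parameters do at inference time (the function and variable names are mine, not Keras's):

```python
import numpy as np

def batchnorm_infer(x, gamma, beta, mean, var, eps=1e-3):
    # Inference-time batch norm: normalize each unit with its own
    # moving statistics, then scale by gamma and shift by beta.
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

# One sample, three units, with the parameters at their initial values
# (gamma=1, beta=0, mean=0, var=1): the output is just x / sqrt(1 + eps).
x = np.array([[1.0, 2.0, 3.0]])
out = batchnorm_infer(x, np.ones(3), np.zeros(3), np.zeros(3), np.ones(3))
```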

get_weights is not documented for BatchNormalization, but it returns a list of four arrays, each with one entry per unit. Presumably, .get_weights()[i][j] returns the value of the ith parameter for the jth unit. But which parameter is which?
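A quick way to see the shape of what comes back (a sketch, assuming a standard tf.keras install; the layer width of 3 is arbitrary):

```python
import tensorflow as tf

# Build a standalone BatchNormalization layer over 3 features,
# without training it, just to inspect its weights.
bn = tf.keras.layers.BatchNormalization()
bn.build(input_shape=(None, 3))

weights = bn.get_weights()
n_arrays = len(weights)            # four parameter arrays
shapes = [w.shape for w in weights]  # each has one entry per unit
```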

This was asked on Stack Overflow, and the answer there suggests they come back in the order I listed them. Looking at the Keras code, add_weight is being called in that order.

Looking at my network, my get_weights()[1] frequently contains negative values. A negative gamma (scaling factor) makes no sense to me. Am I wrong about the order of the parameters? Is there a good reason for a negative gamma? Is this a bug in Keras? Could the documentation be updated to include more information about this?

I just found out that, on the layer object, I can call:

.gamma[ix].numpy()
.beta[ix].numpy()
.moving_mean[ix].numpy()
.moving_variance[ix].numpy()

to get my weights. It appears this is also the order get_weights returns them in (gamma, beta, moving mean, moving variance), not the order I listed previously. That explains my negative values: index 1 is actually beta, not gamma, and a negative beta makes sense.
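The order can be confirmed directly by reading the variable names off the layer's weights list, which lines up one-to-one with what get_weights returns (a sketch, assuming tf.keras; exact name strings vary slightly across Keras versions, e.g. with or without a path prefix and ":0" suffix):

```python
import tensorflow as tf

# Build an untrained BatchNormalization layer and read the variable
# names to see the storage order, which matches get_weights().
bn = tf.keras.layers.BatchNormalization()
bn.build(input_shape=(None, 3))

order = [v.name for v in bn.weights]
# Expected order: gamma, beta, moving_mean, moving_variance.
```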

This might answer all my questions, although documenting some of this would be greatly appreciated.