Layer initializers produce the same weights on every run

I am using tensorflow-macos (v2.9) and tensorflow-metal (v0.5.1). I want to run experiments on copies of models that share the same architecture but have different initial weights. To verify that rebuilding a model produces different starting weights, I created a simple Dense layer and called get_weights() to examine the weight values. By default, the Dense layer's kernel is initialized with GlorotUniform, and I leave the seed attribute at its default (None), which should make the initializer non-deterministic and yield a different random sequence each time the layer is built.

However, when I build the model several times (in a loop, in consecutive build statements, or by re-running the script), I get exactly the same weight values every time. I suspect the initializers themselves are the problem, because if I call them directly to generate an array of values, I see the same behavior whether or not I set a seed. A stripped-down reproduction is below. Is this a TensorFlow problem, or a problem with the tensorflow-macos build? Thanks for any assistance.
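Here is a minimal sketch of what I am running (the helper name build_kernel and the layer/input sizes are just for illustration):

```python
import tensorflow as tf

# Build the same Dense layer twice with the default GlorotUniform
# initializer (seed=None) and compare the resulting kernels.
def build_kernel():
    layer = tf.keras.layers.Dense(4)
    layer.build(input_shape=(None, 8))   # forces weight creation
    return layer.get_weights()[0]        # kernel matrix

w1 = build_kernel()
w2 = build_kernel()

# Expected: False (two independent random draws). On tensorflow-macos
# I get True: the kernels are identical on every build and every run.
print((w1 == w2).all())

# Calling the initializer directly shows the same behavior: this
# prints the same array every time the script runs.
init = tf.keras.initializers.GlorotUniform()   # seed left at None
print(init(shape=(2, 3)))
```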

Update: I think this is a tensorflow-macos problem, because running the same code in a Colab instance (which uses the standard, non-macos TensorFlow build) gives the expected result: the initial weight values are different on every run. I found a work-around by writing my own initializer classes that implement the GlorotUniform and HeUniform random draws, and those work correctly; a rough sketch is below. I'm still wondering whether this is a known bug and, if so, whether it is being worked on. Thanks.
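For reference, this is roughly what the work-around looks like (a minimal sketch: the class names are my own, the fan computation assumes a 2-D Dense kernel, and the samples are drawn with NumPy instead of TF's random ops):

```python
import numpy as np
import tensorflow as tf

class MyGlorotUniform(tf.keras.initializers.Initializer):
    """Draws Glorot-uniform samples with NumPy, bypassing the TF random
    ops that appear to repeat the same sequence under tensorflow-macos."""

    def __call__(self, shape, dtype=None, **kwargs):
        shape = tuple(shape)
        fan_in, fan_out = shape[0], shape[-1]   # assumes a 2-D Dense kernel
        limit = np.sqrt(6.0 / (fan_in + fan_out))
        values = np.random.uniform(-limit, limit, size=shape)
        return tf.constant(values, dtype=dtype or tf.float32)

class MyHeUniform(tf.keras.initializers.Initializer):
    """Same idea for He uniform: limit = sqrt(6 / fan_in)."""

    def __call__(self, shape, dtype=None, **kwargs):
        shape = tuple(shape)
        fan_in = shape[0]                       # assumes a 2-D Dense kernel
        limit = np.sqrt(6.0 / fan_in)
        values = np.random.uniform(-limit, limit, size=shape)
        return tf.constant(values, dtype=dtype or tf.float32)

# Usage: pass the custom initializer when constructing the layer.
layer = tf.keras.layers.Dense(4, kernel_initializer=MyGlorotUniform())
```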