Reuse Normalization Layer in Multiple Models?

I’m training multiple models from the same input data, and I want each of the models to use the same normalization layer:

normalizer = tf.keras.layers.Normalization(axis=None)
normalizer.adapt(vocabulary)

Can I use the same normalizer instance as a Keras layer in all three models without any transfer learning taking place? Or will training the first model create weights or “state” for the layer that will transfer to the subsequent models that share the same layer?

(The context here probably doesn’t matter, but I’m using TensorFlow Recommenders, where I am training two retrieval models and a ranking model.)

I think one can say Normalization is a preprocessing layer that has an internal state and is non-trainable*. The state of a Normalization layer is set before training, so training one model will not change it for the others.
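
You can verify this yourself. Here is a minimal sketch (a small NumPy array stands in for your actual data) showing that the layer’s statistics live in non-trainable weights, so model.fit() has nothing to update:

import numpy as np
import tensorflow as tf

normalizer = tf.keras.layers.Normalization(axis=None)
normalizer.adapt(np.array([1.0, 2.0, 3.0, 4.0]))  # computes mean/variance once

print(normalizer.trainable_weights)      # [] -- nothing for the optimizer to train
print(normalizer.non_trainable_weights)  # mean, variance (and count) set by adapt()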

You provided few details about your models, but perhaps you could add to each of your models a second Input layer fed by a bucket of all the data you want to normalize, followed by a Normalization layer, while the rest of the model is fed from the other Input layer. There are probably many ways to get what you want, though; the simplest is sketched below.
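
For instance, a minimal sketch of sharing the one adapted instance directly (the Dense layers and scalar input shapes are placeholders for your actual model bodies):

inputs_a = tf.keras.Input(shape=(1,))
model_a = tf.keras.Model(inputs_a, tf.keras.layers.Dense(1)(normalizer(inputs_a)))

inputs_b = tf.keras.Input(shape=(1,))
model_b = tf.keras.Model(inputs_b, tf.keras.layers.Dense(1)(normalizer(inputs_b)))

# Training model_a leaves the normalizer's mean/variance untouched,
# so model_b applies exactly the same normalization.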

*It can be “adapted”, though.