I’m training multiple models from the same input data, and I want each of the models to use the same normalization layer:
import tensorflow as tf

normalizer = tf.keras.layers.Normalization(axis=None)
Can I use the same normalizer instance as a Keras layer in all three models without any unintended transfer learning taking place? Or will training the first model create weights or “state” in the layer that carry over to the subsequent models sharing it?
(The context probably doesn’t matter, but I’m using TensorFlow Recommenders, where I’m training two retrieval models and a ranking model.)
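To make the setup concrete, here is a minimal sketch of what I mean by sharing the instance. The toy data and single-Dense-layer architectures are purely illustrative, not my actual recommender models:

```python
import numpy as np
import tensorflow as tf

# Illustrative input data (my real data is different).
data = np.array([[1.0], [2.0], [3.0], [4.0]], dtype="float32")

# One shared Normalization instance, adapted once to the input data.
normalizer = tf.keras.layers.Normalization(axis=None)
normalizer.adapt(data)

def build_model():
    # Each model reuses the *same* normalizer object as a layer.
    inputs = tf.keras.Input(shape=(1,))
    x = normalizer(inputs)
    outputs = tf.keras.layers.Dense(1)(x)
    return tf.keras.Model(inputs, outputs)

model_a = build_model()
model_b = build_model()

# Both models now reference the identical layer instance,
# so they see the same normalization statistics from adapt().
print(model_a.layers[1] is model_b.layers[1])
print(normalizer(data).numpy())
```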