How to remove the normalization layer from a Keras model?

Hello. I have an example of a Keras neural network, and the example uses data normalization. I think it causes problems, because my training set's values decrease over time while the resulting labels stay in the same value range. Can someone advise how to remove the normalization so that training works? The code is:

    import numpy as np
    import tensorflow as tf
    from tensorflow import keras
    from tensorflow.keras import layers

    # `train_features` and `neurons` are defined earlier in my script.
    # Adapt the normalization layer to the training features.
    normalizer = tf.keras.layers.Normalization(axis=-1)
    normalizer.adapt(np.array(train_features))

    def build_and_compile_model(norm):
        model = keras.Sequential([
            norm,  # normalization layer passed in from outside
            layers.Dense(neurons, activation='relu'),
            layers.Dense(neurons, activation='relu'),
            layers.Dense(neurons, activation='relu'),
            layers.Dense(1)
        ])
        model.compile(loss='mean_absolute_error',
                      optimizer=tf.keras.optimizers.Adam(0.001))
        return model

    dnn_model = build_and_compile_model(normalizer)

Thank you so much!

You can remove the normalization layer from your model as shown below. Since the layer is gone, the function no longer needs the `norm` argument, and the `normalizer.adapt(...)` call can be dropped as well:

    def build_and_compile_model():
        # Same architecture, but without the Normalization layer, so the
        # raw (unnormalized) features feed straight into the first Dense layer.
        model = keras.Sequential([
            layers.Dense(neurons, activation='relu'),
            layers.Dense(neurons, activation='relu'),
            layers.Dense(neurons, activation='relu'),
            layers.Dense(1)
        ])
        model.compile(loss='mean_absolute_error',
                      optimizer=tf.keras.optimizers.Adam(0.001))
        return model
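
For completeness, here is a minimal usage sketch of how the rebuilt model could be trained on the raw features. It assumes `train_features`, `train_labels`, and `neurons` are already defined in your script, and the `epochs`/`validation_split` values are illustrative placeholders only:

    # Minimal usage sketch; `train_features`, `train_labels`, and `neurons`
    # come from your existing script, and epochs/validation_split are
    # illustrative values, not recommendations.
    dnn_model = build_and_compile_model()
    history = dnn_model.fit(
        np.array(train_features),
        np.array(train_labels),
        validation_split=0.2,
        epochs=100,
        verbose=0)

Keep in mind that without normalization the scale of the inputs directly affects training, so you may need to adjust the learning rate.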

Thank you.