Loss names do not use the names from the output dictionary

I am not sure if this is expected behavior or not, but when I have a custom layer that returns a dict of tensors, I would like the loss and metric names to use those dict keys, because the auto-generated names can change whenever the set of outputs changes.

For example, if I set my_layer_1_mse as the monitor variable in early stopping and later change the other outputs, this will no longer work, because the metric for out2 may be renamed from my_layer_1_mse to e.g. my_layer_2_mse.

I would like to know if it is possible to have the loss and metric named out1_loss and out2_mse in this specific case, or at least my_layer_out1_loss and my_layer_out2_mse.

Another problem is that I would like the model outputs to follow the same naming convention, if possible, when exporting the model, for example to tfjs. Currently the outputs are named my_layer and my_layer_1 instead of out1 and out2 respectively.

Thank you for your help.

import tensorflow as tf


class Layer(tf.keras.layers.Layer):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)

    def call(self, inputs):
        # Return two constant per-sample outputs, keyed by name.
        batch_size = tf.shape(inputs)[0]
        return {"out1": tf.repeat(1, batch_size), "out2": tf.repeat(2, batch_size)}


def build_model():
    # "inp" avoids shadowing the built-in "input"
    inp = tf.keras.layers.Input(shape=(1,))
    out = Layer(name="my_layer")(inp)

    return tf.keras.Model(
        inputs=inp,
        outputs=out,
    )


model = build_model()

model.compile(
    optimizer=tf.keras.optimizers.Adam(),
    loss={"out1": "mse"},
    metrics={"out2": "mse"},
)

data = (
    # x
    [5, 5, 5, 5, 5, 5, 5, 5],
    # y
    {
        "out1": [5, 5, 5, 5, 5, 5, 5, 5],
        "out2": [5, 5, 5, 5, 5, 5, 5, 5],
    },
)


ds = tf.data.Dataset.from_tensor_slices(data)
ds = ds.batch(2)

model.fit(ds)
4/4 [==============================] - 0s 2ms/step - loss: 16.0000 - my_layer_loss: 16.0000 - my_layer_1_mse: 9.0000
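For reference, the workaround I am currently considering (a sketch, not something I would call a proper fix) is to route each dict entry through an identity Lambda layer named after the key. Since the per-output loss/metric names are derived from the layer that produces each output, this should make them out1_loss and out2_mse instead of my_layer_loss and my_layer_1_mse:

```python
import tensorflow as tf


class Layer(tf.keras.layers.Layer):
    def call(self, inputs):
        batch_size = tf.shape(inputs)[0]
        return {"out1": tf.repeat(1, batch_size), "out2": tf.repeat(2, batch_size)}


def build_model():
    inp = tf.keras.layers.Input(shape=(1,))
    out = Layer(name="my_layer")(inp)
    # Identity layers named after the dict keys: the per-output loss/metric
    # names are derived from these layer names instead of "my_layer"/"my_layer_1".
    named = {
        key: tf.keras.layers.Lambda(lambda t: t, name=key)(tensor)
        for key, tensor in out.items()
    }
    return tf.keras.Model(inputs=inp, outputs=named)


model = build_model()
model.compile(
    optimizer=tf.keras.optimizers.Adam(),
    loss={"out1": "mse"},
    metrics={"out2": "mse"},
)
```

This should also make the exported output names out1 and out2, but it adds boilerplate layers that I would rather avoid.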