Note that sample weighting is automatically supported for any such loss.

I have used sample weights before for segmentation problems like the one in the TensorFlow segmentation guide. However, I have always used them with losses like binary_crossentropy. Since my experience is limited to losses that are already implemented in the tf API, I have the following questions:

How are sample weights applied to custom loss functions?

Binary crossentropy can be computed on every pixel of a 2D mask. Similarity metrics like the Dice score, however, return one value per image. Given that: is it possible to use 2D sample weights with loss functions that are based on similarity metrics (e.g. Dice score)?

If your custom loss function is based on the Dice score, you can multiply the per-sample Dice loss by the sample weights before computing the mean. This gives more weight to the samples that are more important for the task at hand.
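A minimal NumPy sketch of that idea (the function names `dice_loss_per_sample` and `weighted_dice_loss` are my own, not part of any API): the Dice score is reduced over the spatial axes only, so the loss keeps shape `(batch,)`, and the per-sample weights are applied before the final mean.

```python
import numpy as np

def dice_loss_per_sample(y_true, y_pred, eps=1e-7):
    # Reduce over spatial axes only, so the result is one
    # loss value per image in the batch: shape (batch,).
    axes = (1, 2)
    intersection = np.sum(y_true * y_pred, axis=axes)
    union = np.sum(y_true, axis=axes) + np.sum(y_pred, axis=axes)
    dice = (2.0 * intersection + eps) / (union + eps)
    return 1.0 - dice

def weighted_dice_loss(y_true, y_pred, sample_weight):
    # Multiply each sample's loss by its weight before averaging,
    # so important samples contribute more to the gradient.
    per_sample = dice_loss_per_sample(y_true, y_pred)
    return np.mean(per_sample * sample_weight)

# Toy batch: two 4x4 binary masks, second sample weighted twice as heavily.
y_true = np.zeros((2, 4, 4))
y_true[0, :2, :] = 1.0
y_true[1, 2:, :] = 1.0
y_pred = y_true.copy()          # perfect prediction
weights = np.array([1.0, 2.0])
print(weighted_dice_loss(y_true, y_pred, weights))  # ~0 for a perfect prediction
```

The same arithmetic carries over to a `tf.keras.losses.Loss` subclass: as long as your loss returns one value per sample (i.e. you do not reduce over the batch axis yourself), Keras can apply `sample_weight` to it automatically.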

With loss functions based on similarity metrics like the Dice score, it is also possible to use 2D sample weights: each pixel in the mask is given a weight, and those weights enter the Dice computation itself. This lets you up- or down-weight specific regions of the mask during the loss calculation, adjusting each pixel's contribution to the final loss value and accounting for the relative importance of different regions of the image during optimization.
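One way to sketch this in NumPy (the function name `pixel_weighted_dice_loss` is hypothetical, and this is just one of several reasonable ways to fold pixel weights into Dice): every term of the numerator and denominator is scaled by the per-pixel weight map, so high-weight regions dominate the resulting loss.

```python
import numpy as np

def pixel_weighted_dice_loss(y_true, y_pred, pixel_weights, eps=1e-7):
    # pixel_weights has the same spatial shape as the masks.
    # Each pixel's contribution to both the intersection and the
    # union is scaled by its weight, so mispredicting a
    # high-weight region costs more than a low-weight one.
    axes = (1, 2)
    intersection = np.sum(pixel_weights * y_true * y_pred, axis=axes)
    union = (np.sum(pixel_weights * y_true, axis=axes)
             + np.sum(pixel_weights * y_pred, axis=axes))
    dice = (2.0 * intersection + eps) / (union + eps)
    return np.mean(1.0 - dice)

# Toy example: the prediction misses the second row of the object.
y_true = np.zeros((1, 4, 4))
y_true[0, :2, :] = 1.0
y_pred = np.zeros((1, 4, 4))
y_pred[0, 0, :] = 1.0

uniform = np.ones((1, 4, 4))
boundary = np.ones((1, 4, 4))
boundary[0, 1, :] = 5.0  # up-weight the missed region

print(pixel_weighted_dice_loss(y_true, y_pred, uniform))
print(pixel_weighted_dice_loss(y_true, y_pred, boundary))  # larger: the error is penalized more
```

With uniform weights this reduces to the ordinary Dice loss; up-weighting the missed region makes the same prediction error cost more, which is exactly the region-importance behavior described above.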