Help with Loss Function Errors

Hi, I’m new to TensorFlow. While applying tutorial code from the ‘Using tf.data for finer control’ section of the ‘Load and Preprocess data’ tutorial to my own data, which have two categories, I ran into error messages that appear to be related to the loss function. The two loss functions below compile but produce errors during model fitting:

SparseCategoricalCrossentropy results in ‘InvalidArgumentError: Received a label value of 2 which is outside the valid range of [0, 2).’ This is the loss function used in the tutorial.
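If it helps, a tiny standalone snippet (made-up logits, nothing to do with my actual model) seems to reproduce the same message whenever a label of 2 is paired with a two-unit output:

import tensorflow as tf

# Toy example: with two output units, valid sparse labels are 0 and 1
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
logits = tf.constant([[0.3, 0.7]])

print(loss_fn(tf.constant([1]), logits).numpy())  # works
# loss_fn(tf.constant([2]), logits)               # raises the InvalidArgumentError above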

BinaryCrossentropy results in ‘ValueError: logits and labels must have the same shape ((None, 2) vs (None, 1))’. I tried this as an alternative.
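My reading of that error (which could be wrong) is that BinaryCrossentropy wants a single value per example from the model, matching the shape of 0/1 labels, rather than two class scores. A minimal illustration with dummy values:

import tensorflow as tf

bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)
labels = tf.constant([[0.0], [1.0]])    # shape (2, 1), values 0 or 1
logits = tf.constant([[-1.2], [0.8]])   # shape (2, 1), i.e. a single output unit

print(bce(labels, logits).numpy())      # works because the shapes match
# With a final Dense(2) layer the logits have shape (None, 2) and the shape error above appears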

I should also note that I was able to successfully fit a model to my data using the methods described in the first part of the tutorial. It’s only the approach used in the second part of the tutorial that raises errors.

Thanks in advance for any advice.

What are the values of your labels?

for image, label in val_ds.take(10):
    print("Label: ", label.numpy())

Label: [1 1 1 1 1 1 1 1 1 2 2 1 1 1 1 1 1 2 1 1 2 1 2 1 1 1 2 1 1 1 1 2]
Label: [2 2 2 1 2 2 1 2 2 1 2 1 1 2 1 2 2 2 2 2 1 2 1 2 1 2 1 1 2 1 1 1]
Label: [1 1 2 1 2 2 1 1 1 2 2 2 1 1 1 1 2 2 1 2 2 2 1 2 1 2 2 2 2 1 1 1]
Label: [1 1 1 1 1 1 2 2 2 2 1 2 2 1 2 1 1 1 1 1 2 2 1 2 1 1 1 2 2 2 2 1]
Label: [1 2 2 1 1 2 1 1 1 2 1 1 1 2 1 1 2 1 2 1 1 2 1 1 1 1 2 1 1 1 2 1]
Label: [2 2 1 1 1 1 1 2 1 2 1 2 1 2 2 2 2 2 1 1 1 1 1 1 1 1 2 1 1 1 1 1]
Label: [1 2 2 1 1 2 1 2 2 1 2 1 1 2 1 2 2 1 1 1 2 1 2 1 1 2 2 2 1 1 1 2]

Have you tried encoding the labels as 0 and 1?
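For example, something along these lines (just a sketch, assuming your pipeline yields (image, label) pairs named train_ds/val_ds and the labels are currently 1 and 2) would shift them down to 0 and 1:

# Hypothetical remap: shift labels from {1, 2} to {0, 1}
remap = lambda image, label: (image, label - 1)
train_ds = train_ds.map(remap)
val_ds = val_ds.map(remap)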

Kaggle creates an extra file in the working directory, and this turns out to be related to the issue.


As long as I manually remove the extra file from the Kaggle working directory with `os.remove('__notebook_source__.ipynb')`, the encoding works as expected:

# Remove the extra notebook file Kaggle places in the working directory
os.remove('__notebook_source__.ipynb')

# Class names come from the remaining directory names (LICENSE.txt excluded)
class_names = np.array(sorted([item.name for item in data_dir.glob('*') if item.name != "LICENSE.txt"]))
print(class_names)
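That matters because, roughly as in the tutorial’s tf.data section, the integer label is just the position of the parent directory name within class_names, so any extra entry in class_names shifts the two real classes to indices 1 and 2:

# Label lookup along the lines of the tutorial: the label is the index of the
# parent directory name in class_names, so an extra entry shifts the indices
def get_label(file_path):
    parts = tf.strings.split(file_path, os.path.sep)
    one_hot = parts[-2] == class_names
    return tf.argmax(one_hot)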

I’m using /kaggle/working instead of /kaggle/input because of a related issue, which I will post about separately.