Model outputting too many values

I’m doing an image classification exercise with the following model:

    model = keras.models.Sequential([
        keras.layers.Input(shape=(IMAGE_HEIGHT, IMAGE_WIDTH, 3)),
        keras.layers.Conv2D(filters=64, kernel_size=(3, 3), activation="relu"),
        keras.layers.Conv2D(filters=32, kernel_size=(3, 3), activation="relu"),
        keras.layers.Flatten(),
        keras.layers.Dense(64, kernel_regularizer=keras.regularizers.l2(0.001), activation="relu"),
        keras.layers.Dense(32, kernel_regularizer=keras.regularizers.l2(0.001), activation="relu"),
        keras.layers.Dense(16, kernel_regularizer=keras.regularizers.l2(0.001), activation="relu"),
        keras.layers.Dense(2, activation="softmax"),
    ])

Where IMAGE_HEIGHT = 640 and IMAGE_WIDTH = 360

Because of the last layer, I’m expecting it to output only 2 values. However, when I call predict, it outputs:

    [[9.6326274e-01 3.6737300e-02]
     [9.9999464e-01 5.3239705e-06]
     [1.0000000e+00 1.4427736e-08]
     [9.9398309e-01 6.0168877e-03]]

It is predicting 2 labels, a 1 and a 0.

What exactly am I doing wrong?


It looks like the model is working correctly and predicting 2 labels with probabilities for each of the input images.

The first value in each row represents the probability of the image belonging to the first class (label 0) and the second value represents the probability of the image belonging to the second class (label 1).

To get a final prediction for each input image, you can simply choose the class with the highest probability, as shown below:

    import numpy as np

    predictions = model.predict(input_data)

    # Get the index of the class with the highest probability for each sample
    predicted_classes = np.argmax(predictions, axis=1)
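For the four rows printed in the question, this selection can be sketched as a self-contained example (using NumPy directly, with the printed probabilities hard-coded in place of a real `model.predict` call):

```python
import numpy as np

# Softmax probabilities from the question, one row per input image
predictions = np.array([
    [9.6326274e-01, 3.6737300e-02],
    [9.9999464e-01, 5.3239705e-06],
    [1.0000000e+00, 1.4427736e-08],
    [9.9398309e-01, 6.0168877e-03],
])

# argmax along axis=1 picks the column (class) with the highest
# probability in each row
predicted_classes = np.argmax(predictions, axis=1)
print(predicted_classes)  # [0 0 0 0] -- class 0 wins in every row
```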

Thank you!

Ah, I didn’t know you could pass an axis to np.argmax. Thank you!

Just as a side note, why is it outputting 4 values?


You are passing 4 images for the model to predict. Hence, the model returns a 4×2 matrix as output: 4 rows, one per input image, and 2 columns per image, one per class.
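To illustrate the shape, here is a minimal sketch using NumPy arrays to stand in for the batch of inputs and the model’s output (the image dimensions and probabilities are taken from the question; no actual model is run):

```python
import numpy as np

# A batch of 4 RGB images of the question's dimensions: shape (4, 640, 360, 3)
batch = np.zeros((4, 640, 360, 3))

# The model's softmax output for that batch has one row per image and
# one column per class: shape (num_images, num_classes) = (4, 2)
predictions = np.array([
    [9.6326274e-01, 3.6737300e-02],
    [9.9999464e-01, 5.3239705e-06],
    [1.0000000e+00, 1.4427736e-08],
    [9.9398309e-01, 6.0168877e-03],
])

print(batch.shape[0], predictions.shape)  # 4 (4, 2)
```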

Please find the gist for reference.

Thank you!