Node: 'gradient_tape/mean_squared_error/BroadcastGradientArgs' Incompatible shapes: [0,3] vs. [3,1]

Hi

I have created a generator that yields (x, y) pairs, where x has shape (640, 360, 1) and y has shape (3,). The last layer in my model is Dense(3). When I fit the model using the generator, I get an incompatible-shapes error while the mean squared error is being computed. I tried to solve it by reshaping y to (3, 1) and adding a Reshape((3, 1)) layer after the last layer of my model, but that didn't solve anything.

The Error:

Node: 'gradient_tape/mean_squared_error/BroadcastGradientArgs'
Incompatible shapes: [0,3] vs. [3,1]

Extra information, the original generator:

import numpy as np

def create_monkey_images(n):
    ... # uninteresting code
    for i in range(n):
        rotation = ...  # an int value
        center_x = ...  # an int value
        center_y = ...  # an int value

        img = ...  # a grayscale image, 640 wide and 360 high

        x = np.array(img).reshape([640, 360, 1])

        y = np.array([center_x, center_y, rotation])

        yield x, y
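
One thing I considered (just a sketch, not my original code; the float32 dtypes in output_signature are my assumption) is wrapping the generator in a tf.data.Dataset so the per-sample shapes are explicit and batching happens in the pipeline:

import tensorflow as tf

# Sketch only: declare the per-sample shapes up front and let the
# dataset do the batching. The dtypes here are an assumption.
dataset = tf.data.Dataset.from_generator(
    lambda: create_monkey_images(2000),
    output_signature=(
        tf.TensorSpec(shape=(640, 360, 1), dtype=tf.float32),  # x
        tf.TensorSpec(shape=(3,), dtype=tf.float32),           # y
    ),
).batch(32)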

Extra information, the original model:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (
    Convolution2D, LeakyReLU, BatchNormalization, MaxPool2D,
    Flatten, Dense, Dropout,
)

model = Sequential()

model.add(Convolution2D(32, (3,3), padding='same', use_bias=False, input_shape=(96,96,1)))
model.add(LeakyReLU(alpha = 0.1))
model.add(BatchNormalization())

model.add(Convolution2D(32, (3,3), padding='same', use_bias=False))
model.add(LeakyReLU(alpha = 0.1))
model.add(BatchNormalization())
model.add(MaxPool2D(pool_size=(2, 2)))

model.add(Convolution2D(64, (3,3), padding='same', use_bias=False))
model.add(LeakyReLU(alpha = 0.1))
model.add(BatchNormalization())

model.add(Convolution2D(64, (3,3), padding='same', use_bias=False))
model.add(LeakyReLU(alpha = 0.1))
model.add(BatchNormalization())
model.add(MaxPool2D(pool_size=(2, 2)))

model.add(Convolution2D(96, (3,3), padding='same', use_bias=False))
model.add(LeakyReLU(alpha = 0.1))
model.add(BatchNormalization())

model.add(Convolution2D(96, (3,3), padding='same', use_bias=False))
model.add(LeakyReLU(alpha = 0.1))
model.add(BatchNormalization())
model.add(MaxPool2D(pool_size=(2, 2)))

model.add(Convolution2D(128, (3,3),padding='same', use_bias=False))
# model.add(BatchNormalization())
model.add(LeakyReLU(alpha = 0.1))
model.add(BatchNormalization())

model.add(Convolution2D(128, (3,3),padding='same', use_bias=False))
model.add(LeakyReLU(alpha = 0.1))
model.add(BatchNormalization())
model.add(MaxPool2D(pool_size=(2, 2)))

model.add(Convolution2D(256, (3,3),padding='same',use_bias=False))
model.add(LeakyReLU(alpha = 0.1))
model.add(BatchNormalization())

model.add(Convolution2D(256, (3,3),padding='same',use_bias=False))
model.add(LeakyReLU(alpha = 0.1))
model.add(BatchNormalization())
model.add(MaxPool2D(pool_size=(2, 2)))

model.add(Convolution2D(512, (3,3), padding='same', use_bias=False))
model.add(LeakyReLU(alpha = 0.1))
model.add(BatchNormalization())

model.add(Convolution2D(512, (3,3), padding='same', use_bias=False))
model.add(LeakyReLU(alpha = 0.1))
model.add(BatchNormalization())


model.add(Flatten())
model.add(Dense(512,activation='relu'))
model.add(Dropout(0.1))
model.add(Dense(3))
model.summary()

model.compile(optimizer='adam', 
              loss='mean_squared_error',
              metrics=['mae'])
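
As a quick sanity check (just a sketch on my side), I can compare what the model emits per sample with what the generator yields:

print(model.output_shape)              # (None, 3) from the final Dense(3)

x, y = next(create_monkey_images(1))
print(x.shape, y.shape)                # (640, 360, 1) and (3,)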

Extra information, the model.fit call parameters:

model.fit(create_monkey_images(2000), epochs=50, batch_size=256)
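
As far as I understand, when fit is given a generator or a tf.data.Dataset, batching has to happen in the data itself, so with the dataset sketch above the call would look roughly like this (batch_size is not passed, since .batch(32) already handles it):

model.fit(dataset, epochs=50)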

Hi @Asmail, could you please confirm whether you still get the same error even after adding the Reshape layer? Thank you.

Adding the Reshape((3,1)) layer to the model, without reshaping y to (3, 1) in create_monkey_images, fixed the error. Why does reshaping y to the same shape as the Reshape layer's output lead to an error? I don't understand; the original shape of y is (3,).
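
I'm not certain this is exactly what happens inside fit, but a small standalone experiment (my own sketch, not the original code) shows that the MSE loss cannot broadcast a (3, 1) target against a batch of (batch, 3) predictions, which gives the same kind of incompatible-shapes error:

import tensorflow as tf

# Sketch: a (3, 1) target against an arbitrary (4, 3) batch of predictions,
# roughly the way mean_squared_error would see them during training.
y_true = tf.zeros([3, 1])
y_pred = tf.zeros([4, 3])

try:
    print(tf.keras.losses.mean_squared_error(y_true, y_pred))
except tf.errors.InvalidArgumentError as e:
    print(e)  # "Incompatible shapes" from the attempted broadcast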

Thanks for the help.