How to modify the average pooling at the end of the model for a specific number of classes?

I am using the ResNet model from Keras.
I can use

tf.keras.applications.resnet50.ResNet50(include_top=True, weights=None, input_tensor=None,
    input_shape=(224,224,3), classes=2)

and my output will have two classes:

predictions (Dense)             (None, 2)            4098        avg_pool[0][0]                   

Is there any way to not have the Dense layer and instead get the two-class prediction directly via average pooling?
I thought this code should do the trick:

tf.keras.applications.resnet50.ResNet50(
    include_top=False, weights=None, input_tensor=None,
    input_shape=(224,224,3), pooling='avg', classes=2)

but my output is still 2048-dimensional:

avg_pool (GlobalAveragePooling2 (None, 2048)         0           conv5_block3_out[0][0]           

Hi,

The output you get is the feature vector from ResNet50; with include_top=False, the classes argument is ignored.
You now need to add your own classification head with the two classes you want.
For complete code, see this link: Transfer learning with a pre-trained ConvNet | TensorFlow Core

The main difference is that the tutorial's base model is MobileNetV2 instead of ResNet50, but it's the same idea.
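As a minimal sketch of that idea (not the tutorial's exact code, and the variable names here are just illustrative):

    import tensorflow as tf

    # Feature extractor: include_top=False with pooling='avg' yields
    # the (None, 2048) feature vector you are seeing.
    base_model = tf.keras.applications.resnet50.ResNet50(
        include_top=False, weights=None,
        input_shape=(224, 224, 3), pooling='avg')

    # Classification head: map the 2048 features to the 2 classes.
    inputs = tf.keras.Input(shape=(224, 224, 3))
    x = base_model(inputs)
    outputs = tf.keras.layers.Dense(2)(x)
    model = tf.keras.Model(inputs, outputs)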


Thanks for your reply. I already saw the link, but it is not clear to me how exactly I should add that to my model, since I am very new to tf.keras.
So I have the model as defined above in the post.
Now I define this new dense layer for a two-class prediction:

prediction_layer = tf.keras.layers.Dense(2)

How can I add it to the model so that it shows up when I print model.summary()?

For clarification, I don't want to do it like:

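    # Note: data_augmentation, preprocess_input, base_model,
    # global_average_layer, and prediction_layer are all defined in
    # earlier steps of the linked tutorial.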
inputs = tf.keras.Input(shape=(160, 160, 3))
x = data_augmentation(inputs)
x = preprocess_input(x)
x = base_model(x, training=False)
x = global_average_layer(x)
x = tf.keras.layers.Dropout(0.2)(x)
outputs = prediction_layer(x)
model = tf.keras.Model(inputs, outputs)

but I would like to do it so that model.summary() shows them all. Would that be possible?

Why don't you want to do it the way you showed?
That's a valid solution using the Functional API.
The summary will show your layer at the bottom as expected, but the base model will appear as one line only. Is that the problem?

You can still access the summary of the base model by using model.get_layer(index=4).summary().
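If you want every layer to appear in one flat summary instead, one possible trick (a sketch, not the only way) is to attach the new head directly to the base model's own graph rather than wrapping the base model as a single layer:

    import tensorflow as tf

    base_model = tf.keras.applications.resnet50.ResNet50(
        include_top=False, weights=None,
        input_shape=(224, 224, 3), pooling='avg')

    # Building on base_model.input/base_model.output reuses the base model's
    # graph, so summary() lists every ResNet layer plus the new head.
    outputs = tf.keras.layers.Dense(2)(base_model.output)
    model = tf.keras.Model(base_model.input, outputs)
    model.summary()

This also avoids creating a separate tf.keras.Input, since the base model's own input tensor is reused.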

The main reason is that the code I am using right now doesn't have access to the input in the same class where the model is defined, so I would like to finalize the model before passing it to the function that handles the input.
Plus, I don't want to add the dense layer in the function where the input is available, because it just makes the code less clean 🙂 I would rather add every necessary part to the model and just send it to the other function.
Hmmmm, so there is no way to do it without using the input?

Hmm, your reason for the summary issue is not clear to me.

For example, if you use any feature vector from TensorFlow Hub (which is basically what you seem to be doing, just via the Keras applications instead), you won't be able to see the internal summary of the model either, and that is a very common use case.
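For instance, a minimal sketch (assuming the tensorflow_hub package and a public ResNet feature-vector handle, neither of which comes from this thread):

    import tensorflow as tf
    import tensorflow_hub as hub

    # hub.KerasLayer wraps the whole feature extractor as one opaque layer,
    # so model.summary() cannot list its internal layers either.
    feature_extractor = hub.KerasLayer(
        "https://tfhub.dev/google/imagenet/resnet_v2_50/feature_vector/5",
        trainable=False)

    model = tf.keras.Sequential([
        feature_extractor,
        tf.keras.layers.Dense(2)
    ])
    model.build([None, 224, 224, 3])
    model.summary()  # the feature extractor shows up as a single line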