Functional and object-oriented models are not equivalent

I don’t quite understand why these two models, one defined with the functional API:

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

inp = layers.Input((10, 2))
x = layers.Flatten()(inp)
x = layers.Dense(5)(x)
m1 = keras.models.Model(inputs=inp, outputs=x)

and the other in the object-oriented (subclassing) style:

class MyModel(tf.keras.Model):
    def __init__(self, inp_shape, out_size=5):
        super(MyModel, self).__init__()
        self.inp = layers.InputLayer(input_shape=inp_shape)
        self.flatten = layers.Flatten()
        self.dense = layers.Dense(out_size)

    def call(self, a):
        x = self.inp(a)
        x = self.flatten(x)
        x = self.dense(x)
        return x

m2 = MyModel((10, 2))
m2.build(input_shape=(10, 2))

give different results:

> m1.summary()
Model: "functional_3"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_17 (InputLayer)        [(None, 10, 2)]           0         
_________________________________________________________________
flatten_19 (Flatten)         (None, 20)                0         
_________________________________________________________________
dense_18 (Dense)             (None, 5)                 105       
=================================================================
Total params: 105
Trainable params: 105
Non-trainable params: 0

> m2.summary()
Model: "my_model_9"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_18 (InputLayer)        [(None, 10, 2)]           0         
_________________________________________________________________
flatten_20 (Flatten)         multiple                  0         
_________________________________________________________________
dense_19 (Dense)             multiple                  15        
=================================================================
Total params: 15
Trainable params: 15
Non-trainable params: 0

When I test them on a toy tensor, I get:

> tsta = np.random.randn(3,10,2)
> m1(tsta) # correct
> m2(tsta)
InvalidArgumentError: Matrix size-incompatible: In[0]: [3,20], In[1]: [2,5] [Op:MatMul]

What I want to achieve is exactly the same model as m1, but built with the subclassing API.

Here is a complete, self-contained example you can try:

import tensorflow as tf

tsta = tf.random.uniform([3, 10, 2])

tf.random.set_seed(111111)
inp = tf.keras.layers.Input((10, 2))
x = tf.keras.layers.Flatten()(inp)
x = tf.keras.layers.Dense(5)(x)
m1 = tf.keras.Model(inputs=inp, outputs=x)
m1.build(input_shape=(3, 10, 2))
m1.summary()
print(m1(tsta))

tf.random.set_seed(111111)

class MyModel(tf.keras.Model):
    def __init__(self, out_size=5):
        super(MyModel, self).__init__()
        self.flatten = tf.keras.layers.Flatten()
        self.dense = tf.keras.layers.Dense(out_size)

    def call(self, x):
        x = self.flatten(x)
        x = self.dense(x)
        return x

m2 = MyModel()
m2.build(input_shape=(3, 10, 2))
m2.summary()
print(m2(tsta))

Also, why does the summary of m2 give:

Model: "my_model"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
flatten_4 (Flatten)          multiple                  0         
_________________________________________________________________
dense_4 (Dense)              multiple                  105       
=================================================================

Notice the multiple entries in the Output Shape column. The whole reason I’m struggling with this is that I want to check the output sizes of my model objects (the real ones are slightly more complicated than this toy), but all I get so far is multiple. Is there any remedy for this?

A subclassed model doesn’t record static shape information for its layers (there is no layer.output_shape to read), because call() may in principle accept inputs of varying shapes. So summary() prints multiple.

The mismatched parameter counts in your first pair of snippets have a separate cause: m2.build(input_shape=(10, 2)) treats the first dimension as the batch dimension, so Dense(5) is built for inputs whose last dimension is 2 (2 * 5 + 5 = 15 parameters), and the later call on the flattened (3, 20) tensor fails with the MatMul error. Passing the batch dimension explicitly, as build(input_shape=(3, 10, 2)) does in your second example, builds Dense for the flattened size 20 and gives the expected 105 parameters.
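
Here is a minimal sketch of both fixes, assuming TF 2.x. The wrapper model (named inspect_model here for illustration) is one common workaround for inspecting per-layer shapes, not the only way:

import tensorflow as tf

class MyModel(tf.keras.Model):
    def __init__(self, out_size=5):
        super(MyModel, self).__init__()
        self.flatten = tf.keras.layers.Flatten()
        self.dense = tf.keras.layers.Dense(out_size)

    def call(self, x):
        x = self.flatten(x)
        return self.dense(x)

m2 = MyModel()
# Include the batch dimension (None) so Dense is built for the
# flattened feature size 10 * 2 = 20 -> 20 * 5 + 5 = 105 parameters.
m2.build(input_shape=(None, 10, 2))

tsta = tf.random.uniform([3, 10, 2])
print(m2(tsta).shape)  # (3, 5), same as m1

# To see concrete per-layer output shapes instead of "multiple",
# trace call() over a symbolic Input and wrap the result in a
# functional Model used only for inspection.
inp = tf.keras.layers.Input(shape=(10, 2))
inspect_model = tf.keras.Model(inputs=inp, outputs=m2.call(inp))
inspect_model.summary()  # flatten: (None, 20), dense: (None, 5)

The wrapper shares its layers with m2, so the shapes and parameter counts it reports are those of the subclassed model itself.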