Dynamically create a network with subclassing

Hello, I am trying to find out the best way to dynamically define a network. By "dynamically define" I mean passing arguments such as a list of filters/kernel sizes per layer, preferably from a config file. I would like to create a class that receives the network configuration as parameters and then takes care of creating the individual layers.
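For example, the kind of config I have in mind would be something like this (the keys are just placeholders I made up):

# hypothetical config I would pass to the script
encoder_config = {
    "in_shape": (4000, 1),
    "filters_per_layer": [128, 128, 64, 64, 32],
    "kernel_size_per_layer": [4, 4, 4, 4, 4],
    "num_dense": 64,
}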

I have looked at several tutorials and blog posts (the TensorFlow docs, Stack Overflow, and various blogs) but haven't found exactly what I am looking for (this one was promising, but it still writes out a lot of the layers explicitly: How to create deep and dynamic custom Tensorflow models ? (Part-2) | Just AI Stuff). At this point I have the code below.
These are my parameters:

filters_per_layer_list = [128, 128, 64, 64, 32]
kernel_size_per_layer_list = [4, 4, 4, 4, 4]
num_dense = 64
in_shape = (4000, 1)

Network code (unfinished):

from tensorflow import keras
from tensorflow.keras import layers


class Encoder(keras.Sequential):

    def __init__(self, inpt_shape, num_conv_layers, filters, kernel_sz_list, num_dense):
        super(Encoder, self).__init__()
        self.in_shape = inpt_shape
        self.filters_list = filters
        self.kernels_list = kernel_sz_list
        self.dense_nodes = num_dense
        self.num_conv_layers = num_conv_layers
        # self.lr = lr

    def build_model(self):
        self.add(layers.Input(shape=self.in_shape))
        # one Conv1D + MaxPooling1D block per entry in the config lists
        for layer in range(self.num_conv_layers):
            self.add(layers.Conv1D(filters=self.filters_list[layer],
                                   kernel_size=self.kernels_list[layer],
                                   activation='relu',
                                   strides=1,
                                   padding='same'))
            self.add(layers.MaxPooling1D(pool_size=2))
        self.add(layers.Flatten())
        self.add(layers.Dense(self.dense_nodes, activation='relu'))
        self.summary()

I am subclassing the Sequential class, unintuitive as it may seem, because I want to use the native fit/train loop and callbacks.
However, this code does not work: the error I get says that I need to set an input layer (even though one is there…). This is meant to be the encoder part of an autoencoder network. I have also tried building it by creating Sequential encoder and decoder modules and then creating an AE class inheriting from Model.
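That earlier attempt looked roughly like this (a simplified sketch; the decoder construction is omitted and `make_encoder`/`AE` are just names I used):

from tensorflow import keras
from tensorflow.keras import layers

def make_encoder(in_shape, filters_list, kernels_list, num_dense):
    # plain Sequential encoder built from the config lists
    enc = keras.Sequential(name="encoder")
    enc.add(layers.Input(shape=in_shape))
    for f, k in zip(filters_list, kernels_list):
        enc.add(layers.Conv1D(f, k, activation="relu", padding="same"))
        enc.add(layers.MaxPooling1D(pool_size=2))
    enc.add(layers.Flatten())
    enc.add(layers.Dense(num_dense, activation="relu"))
    return enc

class AE(keras.Model):
    # wraps a Sequential encoder and a Sequential decoder
    def __init__(self, encoder, decoder):
        super().__init__()
        self.encoder = encoder
        self.decoder = decoder

    def call(self, x):
        return self.decoder(self.encoder(x))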

Ideally I'd like to avoid writing each layer explicitly, so that I can run multiple experiments just by passing different configurations to the same script, while still leveraging the compile/build/fit/callback machinery of Sequential. I am coming from the PyTorch paradigm, so my approach might be completely wrong. Feel free to suggest a way to achieve the above. Thank you in advance.
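In other words, the usage I am aiming for is roughly this (hypothetical; the data and callbacks are placeholders):

# build the encoder purely from the config values above
enc = Encoder(inpt_shape=in_shape,
              num_conv_layers=len(filters_per_layer_list),
              filters=filters_per_layer_list,
              kernel_sz_list=kernel_size_per_layer_list,
              num_dense=num_dense)
enc.build_model()
enc.compile(optimizer="adam", loss="mse")
# enc.fit(train_data, epochs=10, callbacks=[...])  # placeholder data/callbacks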