SavedModel: enable dropout & disable batch normalization

I am currently training a ResNet model with both batch normalization and dropout layers. My goal is to use Monte Carlo dropout for uncertainty estimation at evaluation time (i.e. with training=False in my model call).

I’m currently working with tf.functions and the SavedModel format (not eager execution). Eager execution is too slow for my application and is therefore not an option for me.

The problem is that setting training=False in my model calls disables both the batch normalization and dropout layers. What I actually want is to keep dropout enabled while disabling batch normalization (for the purpose of uncertainty estimation).

How can I achieve this with tf.function and the SavedModel format?

I am using tf-nightly.

Hi @Zander_Giuffrida,

You can use a custom layer that inherits from tf.keras.layers.Layer and overrides the call method. Inside call, instead of forwarding the outer training argument, pass a fixed value to each sub-layer: training=False to batch normalization (so it keeps using its moving statistics) and training=True to dropout (so it stays active for Monte Carlo sampling). Because these flags are hard-coded, they are baked into the graph that tf.function traces and that the SavedModel exports. A sketch is shown below.
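Here is a minimal sketch of that idea, assuming a simple Dense/BatchNorm/Dropout block. The layer name MCDropoutBlock, the layer sizes, the 32-feature input signature, and the export path are made up for illustration; adapt them to your ResNet.

```python
import tensorflow as tf


class MCDropoutBlock(tf.keras.layers.Layer):
    """Hypothetical block that pins the behaviour of its sub-layers:
    BatchNormalization always runs in inference mode, Dropout always
    stays active (Monte Carlo dropout)."""

    def __init__(self, units, rate=0.5, **kwargs):
        super().__init__(**kwargs)
        self.dense = tf.keras.layers.Dense(units, activation="relu")
        self.bn = tf.keras.layers.BatchNormalization()
        self.dropout = tf.keras.layers.Dropout(rate)

    def call(self, inputs, training=False):
        x = self.dense(inputs)
        # Use the moving statistics regardless of the outer training flag.
        x = self.bn(x, training=False)
        # Keep dropout on so repeated forward passes are stochastic.
        x = self.dropout(x, training=True)
        return x


# Export sketch: the hard-coded flags are frozen into the traced graph,
# so the SavedModel keeps dropout active and batch norm in inference mode.
model = tf.keras.Sequential([MCDropoutBlock(64), tf.keras.layers.Dense(10)])


@tf.function(input_signature=[tf.TensorSpec([None, 32], tf.float32)])
def serve(x):
    return model(x, training=False)


_ = serve(tf.zeros([1, 32]))  # trace once so the variables are created
tf.saved_model.save(model, "/tmp/mc_model",
                    signatures={"serving_default": serve})
```

At evaluation time you would call the exported signature several times on the same batch and aggregate the outputs (e.g. mean and variance) to get the Monte Carlo uncertainty estimate.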

I hope this helps you!

Thanks.