I am currently training a ResNet model that contains both batch normalization and dropout layers. My goal is to use Monte Carlo dropout for uncertainty estimation at evaluation time (i.e. with training=False in my model call).
I’m working with tf.function and the SavedModel format (not eager execution). Eager execution is too slow for my application, so it is not an option for me.
The problem is that setting training=False in my model calls disables both the batch normalization and the dropout layers. What I actually want is to keep dropout enabled while disabling batch normalization (i.e. have batch norm use its moving statistics), so that I can sample multiple stochastic forward passes for uncertainty estimation.
How can I achieve this with tf.function and the SavedModel format?
I am using tf-nightly.
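For context, here is a minimal sketch of the behavior I am after. It subclasses tf.keras.layers.Dropout so that dropout stays active regardless of the training flag, while BatchNormalization still respects training=False. The MCDropout class name is my own illustration, not a built-in Keras layer, and I am unsure whether this is the idiomatic way to do it with tf.function and SavedModel:

```python
import tensorflow as tf

# Illustrative subclass: ignores the `training` flag and always applies
# dropout, so Monte Carlo sampling works even when the model is called
# with training=False. (Not a built-in Keras layer.)
class MCDropout(tf.keras.layers.Dropout):
    def call(self, inputs, training=None):
        # Force dropout on unconditionally.
        return super().call(inputs, training=True)

# Toy model mixing batch norm (inference mode) with always-on dropout.
inputs = tf.keras.Input(shape=(8,))
x = tf.keras.layers.Dense(16, activation="relu")(inputs)
x = tf.keras.layers.BatchNormalization()(x)
x = MCDropout(0.5)(x)
outputs = tf.keras.layers.Dense(1)(x)
model = tf.keras.Model(inputs, outputs)

@tf.function
def predict(x):
    # training=False: batch norm uses its moving statistics,
    # but MCDropout still samples a fresh mask on every call.
    return model(x, training=False)

x = tf.random.normal((4, 8))
samples = tf.stack([predict(x) for _ in range(10)])  # shape (10, 4, 1)
# Spread across MC samples serves as the uncertainty estimate.
uncertainty = tf.math.reduce_std(samples, axis=0)
```

My question is whether this pattern (or something equivalent) survives tracing and export via tf.saved_model.save, or whether there is a better-supported way to decouple the two layers' training behavior.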