Dropout, SavedModel and TF Serving

I’ve trained a neural network with dropout and saved it using the SavedModel format.
Now I’ve deployed it with TensorFlow Serving.

Everything works well, but I’m wondering whether dropout is removed from the computation.
Does the SavedModel format disable dropout, the way calling the model with training=False would? I guess it should, since the SavedModel format is meant for inference, but I’d like to be sure.
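
To illustrate what I mean, a toy example (not my actual network) showing the two behaviours of a dropout layer:

>>> import tensorflow as tf
>>> drop = tf.keras.layers.Dropout(0.5)
>>> x = tf.ones((1, 4))
>>> drop(x, training=True)   # training mode: entries zeroed at random, survivors scaled by 1/(1 - 0.5)
>>> drop(x, training=False)  # inference mode: dropout is a no-op, returns x unchanged

What I’d like to know is which of these two behaviours the exported serving graph uses.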

Thank you all !

I had the same question.

I’ve done a small test to check whether dropout is removed when exporting to SavedModel and running inference with TensorFlow Serving.

The test is:

  1. create a dummy model with known weights and a dropout layer
  2. export it to SavedModel
  3. deploy it with TensorFlow Serving
  4. run inference multiple times and check whether the output value changes

Example:

Create the dummy model, test it, and save it:

>>> import tensorflow as tf
>>> # One Dense unit with its weight fixed to 1.0, followed by dropout.
>>> nn = tf.keras.Sequential([tf.keras.layers.Dense(1, kernel_initializer='ones'),
...                           tf.keras.layers.Dropout(0.25)])
>>> a = tf.constant([[2.0]])
>>> nn(a, training=False)  # inference mode: dropout inactive, so output = 1.0 * 2.0
<tf.Tensor: shape=(1, 1), dtype=float32, numpy=array([[2.]], dtype=float32)>
>>> tf.saved_model.save(nn, "test_dummy")
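
Before involving TensorFlow Serving at all, you can sanity-check the exported serving signature locally. A sketch, assuming the "test_dummy" export from above (the signature's input name depends on how the model was traced, so I look it up rather than hard-coding it):

import tensorflow as tf

loaded = tf.saved_model.load("test_dummy")
infer = loaded.signatures["serving_default"]

# Signature functions take keyword arguments; recover the input name from the spec.
input_name = list(infer.structured_input_signature[1].keys())[0]

# If dropout were active in this signature, repeated calls would return different values.
for _ in range(5):
    print(infer(**{input_name: tf.constant([[2.0]])}))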

Launch TensorFlow Serving (see the TensorFlow Serving documentation for details).
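
One way to launch it, a sketch assuming Docker and that the model is served under the name dummy to match the request below (TensorFlow Serving expects a numeric version subdirectory under the model base path, hence the /1 in the target):

$ docker run -p 8501:8501 \
    --mount type=bind,source="$(pwd)/test_dummy",target=/models/dummy/1 \
    -e MODEL_NAME=dummy -t tensorflow/serving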

Get inference from TF Serving:

$ curl -d '{"instances": [[2.0]]}' -X POST http://localhost:8501/v1/models/dummy:predict
{
    "predictions": [[2.0]
    ]
}

I ran dozens of inferences and never got a value other than 2.0, which is the expected output when dropout is inactive.
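
For anyone who wants to reproduce the check, a sketch of the kind of loop I mean: it flattens each response to a single line and counts the distinct outputs, so with dropout inactive you should see exactly one line, with a count of 50.

$ for i in $(seq 1 50); do
    curl -s -d '{"instances": [[2.0]]}' -X POST \
      http://localhost:8501/v1/models/dummy:predict | tr -d ' \n'
    echo
  done | sort | uniq -c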

Conclusion: I do think dropout (and this probably extends to other training-only behaviour, like batch normalization) is disabled in the SavedModel, or at least when it is served with TensorFlow Serving. That matches how Keras exports models: the default serving_default signature is traced in inference mode (training=False), so dropout becomes a no-op in the exported serving graph.
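
One nuance worth noting: dropout is not necessarily stripped out of the SavedModel itself, it is just not used by the serving signature. A sketch to probe this, assuming Keras also saved the training-mode trace of the call function (which it should for a simple model like this one):

import tensorflow as tf

loaded = tf.saved_model.load("test_dummy")
x = tf.constant([[2.0]])

# Inference mode: deterministic, always 2.0.
print(loaded(x, training=False))

# Training mode: if this trace was saved, dropout fires and each call
# returns either 0.0 or 2.0 / (1 - 0.25) ≈ 2.67.
for _ in range(3):
    print(loaded(x, training=True))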

Tell me if you think I missed something 🙂
Benoît.