The importance of the Dropout layer in TensorFlow

I want to understand what the Dropout layer does.
I have read a lot about it, but I still can't understand how it changes the data to prevent overfitting.

import numpy as np
import tensorflow as tf

# Five samples with two features each: [[0., 1.], [2., 3.], ...]
inputs = np.arange(10).reshape(5, 2).astype(np.float32)
layer = tf.keras.layers.Dropout(0.2, input_shape=(2,))
# training=True switches the random dropping on for this call
outputs = layer(inputs, training=True)
print(outputs)

On every run of the application, the outputs are different.

@Atef_Yasser,

Welcome to the TensorFlow Forum!

On every run of the application, the outputs are different.

Yes. The Dropout layer randomly sets input units to 0 with a frequency of rate at each step during training, which helps prevent overfitting. Inputs that are not set to 0 are scaled up by 1/(1 - rate) so that the expected sum over all inputs stays unchanged; that scaling is why the surviving values also look different from the raw inputs.
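To see both behaviors in a minimal, self-contained sketch (plain TensorFlow, nothing beyond the standard Keras API assumed), feed a constant vector through Dropout(0.2): surviving units become 1/(1 - 0.2) = 1.25 and dropped units become 0.

import tensorflow as tf

# A constant input makes the scaling easy to spot.
x = tf.ones((1, 10))
layer = tf.keras.layers.Dropout(rate=0.2)

# training=True enables dropout; survivors are scaled by 1 / (1 - rate) = 1.25.
print(layer(x, training=True).numpy())

# training=False (the default) disables dropout, so the output equals the input.
print(layer(x, training=False).numpy())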

For example, suppose a given layer would normally return the vector [0.2, 0.5, 1.3, 0.8, 1.1] for some input sample during training; after applying dropout, this vector will have a few entries zeroed out at random, e.g. [0, 0.5, 1.3, 0, 1.1].
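In a real model, Dropout is typically placed between layers; here is a minimal sketch of that (the layer sizes and loss are arbitrary choices for illustration, not from your code):

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dropout(0.5),  # randomly zero half of the 64 activations during training
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# model.fit(...) applies dropout automatically; model.predict(...) does not.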

Please refer to this example to see the role dropout plays in reducing overfitting.
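As a side note on the run-to-run differences: the dropout mask is random by design, but if you need reproducible outputs for debugging you can seed it; a minimal sketch (the seed values here are arbitrary):

import tensorflow as tf

tf.random.set_seed(42)                        # global seed
layer = tf.keras.layers.Dropout(0.2, seed=1)  # per-layer seed

x = tf.ones((1, 6))
# With both seeds fixed, the mask should be the same on every fresh run.
print(layer(x, training=True).numpy())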

Thank you!

Dropout is a great technique for preventing overfitting. I analyzed it and tried to explain it in a very clear way in this article.

I hope you find it useful.
