Dropout in Tensorflow Hub Problem

Recently I have been trying to use BERT from TF Hub for contrastive learning. As you know, the dropout rate is an important parameter in contrastive learning (for NLP tasks), but I can't find any way to set the dropout rate through the TF Hub API. Is there some way to set dropout in TF Hub?


Hi Ryan,

I don’t think you can change dropout as a parameter on the models from Hub.

What you could do instead is create a model yourself (e.g. following Classify text with BERT | Text | TensorFlow) based on the BERT encoders on TF Hub
and define the dropout rate you want. That way you have full control over the parameter.
Does that work for you?
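The suggestion above can be sketched as follows. This is a minimal Keras sketch, not a definitive implementation: a small dense stack stands in for the TF Hub BERT encoder so the example runs without a download (in practice you would use a `hub.KerasLayer` loaded from an encoder URL), and all layer names and rates here are illustrative.

```python
import tensorflow as tf

def build_classifier(dropout_rate=0.1, num_classes=2):
    # In a real setup the encoder would come from TF Hub, e.g.:
    #   encoder = hub.KerasLayer(<bert encoder URL>, trainable=True)
    # Here a small dense layer stands in so the sketch runs offline.
    inputs = tf.keras.Input(shape=(128,), name="features")
    x = tf.keras.layers.Dense(64, activation="relu",
                              name="stand_in_encoder")(inputs)
    # The dropout layer you add on top is fully under your control:
    # its rate is an ordinary constructor argument.
    x = tf.keras.layers.Dropout(dropout_rate, name="classifier_dropout")(x)
    outputs = tf.keras.layers.Dense(num_classes, name="classifier")(x)
    return tf.keras.Model(inputs, outputs)

model = build_classifier(dropout_rate=0.3)
```

This gives you the dropout rate of the classification head, though (as noted below) not of the pretrained encoder itself.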

Thanks for your reply.
But for contrastive learning I need to set the dropout of the pretrained part itself, so the tutorial Classify text with BERT | Text | TensorFlow can’t help me. Is there any solution or upcoming feature for contrastive learning in TF Hub? (Reference: https://arxiv.org/abs/2104.08821)
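For context, the SimCSE technique referenced above only needs dropout to be active during the forward pass: encoding the same batch twice produces two different dropout masks, and the two resulting embeddings of each sentence form a positive pair. A minimal NumPy sketch of that idea, where the one-layer "encoder", the weights, and the rate are illustrative stand-ins rather than the actual TF Hub model:

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_with_dropout(x, w, rate, rng):
    # One stochastic forward pass: inverted dropout on the hidden layer.
    h = np.maximum(x @ w, 0.0)           # stand-in "encoder"
    mask = rng.random(h.shape) >= rate   # keep mask, drawn fresh each call
    return (h * mask) / (1.0 - rate)     # rescale to keep the expectation

x = rng.standard_normal((4, 8))          # a batch of 4 "sentence" vectors
w = rng.standard_normal((8, 16))

# SimCSE trick: feed the same batch twice; the two dropout masks differ,
# so each sentence gets two distinct views that form a positive pair.
z1 = encode_with_dropout(x, w, rate=0.1, rng=rng)
z2 = encode_with_dropout(x, w, rate=0.1, rng=rng)
```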


Sorry Ryan, I don’t know of one for text, only for images (TensorFlow Hub)


I think dropout was removed in ALBERT V2, for example:
