Is text preprocessing or cleaning required for BERT or other pre-trained models?

Can someone tell me whether any text preprocessing is required for pre-trained models like BERT, T5, etc.?

Traditional preprocessing (stop-word removal, stemming, stripping punctuation) is not needed for pre-trained language representation models like BERT. The model uses all of the information in a sentence, including punctuation and stop words, viewing it from a wide range of perspectives by leveraging a multi-head self-attention mechanism. The one step you do need is to run the raw text through the model's own subword tokenizer (WordPiece for BERT, SentencePiece for T5), which handles splitting for you. Thank you.
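To illustrate why stripping punctuation and stop words is unnecessary, here is a toy greedy WordPiece-style tokenizer. The vocabulary below is purely hypothetical for the sake of the example; real models ship their own learned vocabulary. Note how punctuation and stop words simply become tokens instead of being discarded:

```python
import re

# Hypothetical toy vocabulary; real BERT has ~30k learned subwords.
VOCAB = {"the", "model", "keep", "##s", "stop", "word", ",", ".", "[UNK]"}

def wordpiece(word, vocab=VOCAB):
    """Greedy longest-match-first subword split, as WordPiece does."""
    pieces, start = [], 0
    while start < len(word):
        end, piece = len(word), None
        while start < end:
            sub = word[start:end]
            if start > 0:
                sub = "##" + sub       # continuation pieces are prefixed
            if sub in vocab:
                piece = sub
                break
            end -= 1
        if piece is None:              # no subword matches: unknown token
            return ["[UNK]"]
        pieces.append(piece)
        start = end
    return pieces

def tokenize(text, vocab=VOCAB):
    # Split punctuation off but keep it: it becomes a token, not noise.
    words = re.findall(r"\w+|[^\w\s]", text.lower())
    return [p for w in words for p in wordpiece(w, vocab)]

print(tokenize("The model keeps stop words."))
# → ['the', 'model', 'keep', '##s', 'stop', 'word', '##s', '.']
```

Because the stop word "the" and the final "." survive tokenization, the self-attention layers can use them; removing them beforehand would only throw away signal the model was pre-trained to exploit.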

Thank you. We use all of these steps in traditional NLP pipelines; Transformers don't need them.