Is text preprocessing or cleaning required for BERT or other pre-trained models?

Can someone tell me whether any text preprocessing is required for pre-trained models like BERT, T5, etc.?

Preprocessing is generally not needed when using pre-trained language representation models like BERT. The model makes use of all the information in a sentence, including punctuation and stop-words, from a wide range of perspectives by leveraging its multi-head self-attention mechanism. Thank you.
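A minimal sketch of what this means in practice, assuming the Hugging Face `transformers` library is installed; `bert-base-uncased` is just an example checkpoint. The raw sentence, punctuation and stop-words included, is passed straight to the model's own tokenizer with no manual cleaning step:

```python
# Sketch: feeding raw, uncleaned text to a BERT tokenizer.
# Assumes the `transformers` library; "bert-base-uncased" is an example checkpoint.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# Raw text with casing, punctuation, and stop-words left in place.
text = "Don't remove the stop-words, punctuation, or casing!"
encoded = tokenizer(text)

# The tokenizer handles lowercasing and subword splitting itself,
# and adds the special [CLS]/[SEP] tokens the model expects.
print(tokenizer.convert_ids_to_tokens(encoded["input_ids"]))
```

The tokenizer that ships with each checkpoint performs whatever normalization the model was pre-trained with, so doing your own cleaning on top can actually hurt.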

Thank you. We use all of those steps in traditional NLP pipelines; Transformers don't need them.
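For contrast, a small standard-library sketch of the kind of manual cleaning a traditional bag-of-words pipeline relies on but BERT does not; the stop-word list here is a tiny illustrative sample, not a real one:

```python
# Sketch of traditional preprocessing (lowercase, strip punctuation,
# remove stop-words) that pre-trained Transformers make unnecessary.
import re
import string

# Tiny illustrative stop-word list (a real pipeline would use a fuller set).
STOP_WORDS = {"the", "a", "an", "is", "are", "of", "to", "and"}

def traditional_clean(text: str) -> list:
    text = text.lower()                                               # normalize casing
    text = text.translate(str.maketrans("", "", string.punctuation))  # strip punctuation
    tokens = re.split(r"\s+", text.strip())                           # whitespace tokenize
    return [t for t in tokens if t and t not in STOP_WORDS]           # drop stop-words

print(traditional_clean("The model IS ready, and the results are good!"))
# -> ['model', 'ready', 'results', 'good']
```

Steps like these throw away casing and punctuation signals that BERT's self-attention can actually exploit, which is why they are skipped for pre-trained models.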

How about using BERT on my own custom dataset? Do I need to preprocess my data? If so, can you provide the steps, and a video tutorial would also be helpful. I really need this because I don't know how to preprocess my data, and I will be using a BERT model on a custom question-and-answer dataset to evaluate a client.