How to train with negative examples

I finally built my custom object detection model and it works reasonably well. However, I have found some detections that are totally incorrect. Is there a way I can tell TFLite Model Maker to treat certain background images as negative examples, i.e. as containing nothing to detect?

I have googled the topic and the only relevant suggestion was to label my negative examples with no bounding boxes. The post, however, is quite old and I am not sure whether this approach still works with TFLite / TF 2.x.

Thanks,

@Alvaro_Tester,

Yes, you can include images with no bounding boxes as negative samples. If you are facing problems with TF 1.x code, please refer to the migration guide, which helps you convert 1.x code to 2.x.
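
As an illustration, here is a minimal sketch of how negative examples could be mixed into a Model Maker training run. It assumes the AutoML-style CSV format accepted by `object_detector.DataLoader.from_csv`, where a row that lists only the image path and leaves the label and coordinate columns blank marks that image as containing no objects. The file names, labels, and hyperparameters below are placeholders, and whether your Model Maker version accepts the empty-row convention is worth verifying on a small dataset first.

```python
# Minimal sketch: training an object detector with TFLite Model Maker while
# including negative (background-only) images as rows with no label or boxes.
#
# train.csv (hypothetical contents, AutoML-style format):
#   TRAIN,images/dog_1.jpg,dog,0.1,0.2,,,0.6,0.8,,
#   TRAIN,images/empty_yard.jpg,,,,,,,,,         <- negative example: no boxes
#   VALIDATION,images/dog_2.jpg,dog,0.3,0.1,,,0.9,0.7,,
#   TEST,images/dog_3.jpg,dog,0.2,0.2,,,0.7,0.9,,

from tflite_model_maker import model_spec, object_detector

# Pick an EfficientDet-Lite backbone; any of the lite0-lite4 specs works.
spec = model_spec.get('efficientdet_lite0')

# from_csv splits the data according to the TRAIN/VALIDATION/TEST column.
train_data, validation_data, test_data = object_detector.DataLoader.from_csv('train.csv')

# Train; the negative rows contribute background-only images to the batches.
model = object_detector.create(
    train_data,
    model_spec=spec,
    epochs=50,
    batch_size=8,
    train_whole_model=True,
    validation_data=validation_data,
)

model.export(export_dir='.')  # writes model.tflite
```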

Thank you!