Semi-supervised learning with PAWS (🐾)

PAWS introduces a way to combine a small fraction of labeled samples with unlabeled ones during the pre-training of vision models. With its simple approach, it sets a new state of the art in semi-supervised learning while using far less compute and far fewer parameters.

Here’s my implementation of PAWS in TensorFlow:

For the benefit of the community, I have included all the major bits needed to make PAWS work. These recipes are broadly applicable to training self-supervised and semi-supervised models at scale:

  • Multi-crop augmentation policy (helps a network systematically learn local-to-global mappings; a rough sketch appears after the notebook note below)
  • Class-stratified sampling (a minimal sketch follows this list)
  • WarmUpCosine LR schedule (also sketched after this list)
  • Training with the LARS optimizer (with the correct hyperparameter choices)
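For the class-stratified sampling bit, the idea is that every labeled "support" batch contains an equal number of images per class rather than a random slice of the labeled set. Here is a minimal NumPy sketch of that idea; the helper name `stratified_support_indices` and its parameters are mine for illustration, not names from the repository:

```python
import numpy as np


def stratified_support_indices(labels, classes_per_batch, images_per_class, seed=None):
    """Sample a class-balanced support batch: `classes_per_batch` classes,
    `images_per_class` labeled examples from each class (hypothetical helper)."""
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)

    # Pick which classes appear in this support batch.
    chosen_classes = rng.choice(np.unique(labels), size=classes_per_batch, replace=False)

    # For every chosen class, draw the same number of labeled examples.
    indices = []
    for c in chosen_classes:
        class_indices = np.where(labels == c)[0]
        indices.append(rng.choice(class_indices, size=images_per_class, replace=False))
    return np.concatenate(indices)
```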
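The WarmUpCosine schedule linearly ramps the learning rate up for the first few steps and then decays it with a half-cosine for the rest of training. Below is a minimal `tf.keras` `LearningRateSchedule` sketch; the class name and argument names are illustrative and may differ from what the repository actually uses:

```python
import math

import tensorflow as tf


class WarmUpCosine(tf.keras.optimizers.schedules.LearningRateSchedule):
    """Linear warm-up followed by cosine decay to zero (illustrative sketch)."""

    def __init__(self, base_lr, warmup_steps, total_steps):
        super().__init__()
        self.base_lr = base_lr
        self.warmup_steps = warmup_steps
        self.total_steps = total_steps

    def __call__(self, step):
        step = tf.cast(step, tf.float32)
        warmup_steps = tf.cast(self.warmup_steps, tf.float32)
        total_steps = tf.cast(self.total_steps, tf.float32)

        # Ramp linearly from 0 to base_lr during the warm-up phase.
        warmup_lr = self.base_lr * step / tf.maximum(warmup_steps, 1.0)

        # Then decay from base_lr to 0 with a half-cosine over the remaining steps.
        progress = (step - warmup_steps) / tf.maximum(total_steps - warmup_steps, 1.0)
        progress = tf.minimum(progress, 1.0)
        cosine_lr = 0.5 * self.base_lr * (1.0 + tf.cos(math.pi * progress))

        return tf.where(step < warmup_steps, warmup_lr, cosine_lr)
```

A schedule like this can be passed straight to an optimizer's `learning_rate` argument, with `base_lr` typically scaled alongside the batch size.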

Additionally, I have included a Colab Notebook that walks through the multi-crop augmentation method, since it can seem daunting the first time you work through it.
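If a quick mental model helps before opening the notebook: each image yields a couple of large "global" views and several small "local" views, each produced by a random resized crop plus a flip. The sketch below captures that idea in plain TensorFlow; the helper names, crop sizes, and scale ranges are assumptions for illustration rather than the exact settings used in the notebook:

```python
import tensorflow as tf


def random_resized_crop(image, crop_size, min_scale, max_scale):
    """Crop a random square covering `min_scale`..`max_scale` of the image area,
    then resize it to `crop_size` x `crop_size`."""
    height = tf.shape(image)[0]
    width = tf.shape(image)[1]
    area = tf.cast(height * width, tf.float32)

    scale = tf.random.uniform([], min_scale, max_scale)
    crop_side = tf.cast(tf.sqrt(scale * area), tf.int32)
    crop_side = tf.minimum(crop_side, tf.minimum(height, width))

    crop = tf.image.random_crop(image, tf.stack([crop_side, crop_side, 3]))
    return tf.image.resize(crop, [crop_size, crop_size])


def multi_crop(image, num_global=2, num_local=6):
    """Return a few large 'global' views and several small 'local' views of one image."""
    global_views = [
        tf.image.random_flip_left_right(
            random_resized_crop(image, crop_size=224, min_scale=0.14, max_scale=1.0)
        )
        for _ in range(num_global)
    ]
    local_views = [
        tf.image.random_flip_left_right(
            random_resized_crop(image, crop_size=96, min_scale=0.05, max_scale=0.14)
        )
        for _ in range(num_local)
    ]
    return global_views, local_views
```

The global views see most of the image while the local views only see small patches, which is what lets the network learn the local-to-global mappings mentioned above.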

The results are pretty promising. I encourage folks to check it out.


Massive share, thank you Sayak

Loved the PAWS paper
