[Code] [arXiv] [Models] [BibTeX] [Follow-up work (MIM-Refiner)]
Masked AutoEncoder: Contrastive Tuning (MAE-CT) tunes the representation of a pre-trained MAE to form semantic clusters via an NNCLR training stage.
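For intuition, below is a minimal sketch of what an NNCLR-style tuning stage looks like on top of a pre-trained encoder. All names here (`NNCLRHead`, `nnclr_loss`), the queue size, and the temperature are illustrative assumptions for this sketch, not this repository's actual API; see the code and paper for the real training setup.

```python
# Hypothetical NNCLR-style contrastive tuning sketch (not this repo's API).
import torch
import torch.nn.functional as F
from torch import nn

class NNCLRHead(nn.Module):
    """Projection + prediction heads with a FIFO support queue of past embeddings."""
    def __init__(self, dim=768, proj_dim=256, queue_size=65536):
        super().__init__()
        self.projector = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, proj_dim))
        self.predictor = nn.Sequential(nn.Linear(proj_dim, proj_dim), nn.ReLU(), nn.Linear(proj_dim, proj_dim))
        self.register_buffer("queue", F.normalize(torch.randn(queue_size, proj_dim), dim=1))
        self.register_buffer("ptr", torch.zeros(1, dtype=torch.long))

    @torch.no_grad()
    def nearest_neighbor(self, z):
        # Swap each embedding for its nearest neighbor in the support queue.
        sim = z @ self.queue.T                # (B, Q) cosine similarities
        return self.queue[sim.argmax(dim=1)]  # (B, proj_dim)

    @torch.no_grad()
    def dequeue_enqueue(self, z):
        # Overwrite the oldest queue entries with the current batch.
        b, ptr = z.shape[0], int(self.ptr)
        idx = torch.arange(ptr, ptr + b, device=z.device) % self.queue.shape[0]
        self.queue[idx] = z
        self.ptr[0] = (ptr + b) % self.queue.shape[0]

def nnclr_loss(head, h1, h2, temperature=0.15):
    """Symmetric InfoNCE where one side of each positive pair is its queue NN."""
    z1 = F.normalize(head.projector(h1), dim=1)
    z2 = F.normalize(head.projector(h2), dim=1)
    p1 = F.normalize(head.predictor(z1), dim=1)
    p2 = F.normalize(head.predictor(z2), dim=1)
    nn1, nn2 = head.nearest_neighbor(z1), head.nearest_neighbor(z2)
    labels = torch.arange(z1.shape[0], device=z1.device)
    loss = 0.5 * (F.cross_entropy(nn1 @ p2.T / temperature, labels)
                  + F.cross_entropy(nn2 @ p1.T / temperature, labels))
    head.dequeue_enqueue(z1)  # refresh the support set with current embeddings
    return loss
```

In MAE-CT terms, `h1` and `h2` would be per-image embeddings of two augmented views produced by the pre-trained MAE encoder (e.g., its `[CLS]` token); the nearest-neighbor swap against the queue is what encourages semantically similar images to collapse into shared clusters rather than merely matching augmentations of the same image.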