
SparK

Revolutionizing CNNs with BERT-Style Self-Supervised Learning

Product Description

SparK offers an innovative method for applying BERT-style self-supervised pretraining to convolutional neural networks. It works with common CNN architectures such as ResNet with minimal changes and advances image classification capabilities. Through masked modeling, SparK-pretrained CNNs can surpass larger models trained without pretraining and rival Swin-Transformer models. The pretraining also scales well, with larger models benefiting more. For a detailed analysis of the advantages of generative self-supervised pretraining, refer to our ICLR 2023 Spotlight paper. In addition, our Colab demos visualize model reconstruction and illustrate the difficulty standard conv layers have with masked input.
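To make the conv-layer masking issue mentioned above concrete, here is a minimal PyTorch sketch (not from the SparK codebase; all names, shapes, and values are illustrative) showing how a plain dense convolution lets information leak into masked positions, eroding the mask pattern that masked modeling relies on. SparK avoids this by treating the visible patches as a sparse input.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy input: a 3-channel 8x8 feature map with roughly 60% of positions
# masked out (1 = visible, 0 = masked), similar to SparK's default mask ratio.
x = torch.randn(1, 3, 8, 8)
mask = (torch.rand(1, 1, 8, 8) > 0.6).float()
x_masked = x * mask

# A plain dense convolution, as used in an ordinary CNN.
conv = nn.Conv2d(3, 3, kernel_size=3, padding=1, bias=False)

with torch.no_grad():
    y = conv(x_masked)

# After a single 3x3 conv, masked positions adjacent to visible pixels pick up
# non-zero activations, so the mask pattern erodes layer by layer; this is the
# behavior that sparse computation on visible positions is meant to prevent.
masked_positions = (mask == 0)
leaked = (y.abs().sum(dim=1, keepdim=True)[masked_positions] > 1e-6).float().mean()
print(f"fraction of masked positions receiving leaked information: {leaked.item():.2f}")
```

Running the sketch prints a non-zero fraction, illustrating that a stack of ordinary conv layers would quickly blur the visible/masked distinction that BERT-style pretraining depends on.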
Project Details