Preprints, Working Papers, ... Year: 2022

A soft nearest-neighbor framework for continual semi-supervised learning

Abstract

Despite significant advances, the performance of state-of-the-art continual learning approaches hinges on the unrealistic scenario of fully labeled data. In this paper, we tackle this challenge and propose an approach for continual semi-supervised learning, a setting where not all the data samples are labeled. An underlying issue in this scenario is that the model forgets the representations of unlabeled data and overfits the labeled ones. We leverage the power of nearest-neighbor classifiers to non-linearly partition the feature space and learn a strong representation for the current task, as well as to distill relevant information from previous tasks. We perform a thorough experimental evaluation and show that our method outperforms all existing approaches by large margins, setting a strong state of the art for the continual semi-supervised learning paradigm. For example, on CIFAR-100 we surpass several methods even when using at least 30 times less supervision (0.8% vs. 25% of annotations). The code is publicly available at https://github.com/kangzhiq/NNCSL
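The soft nearest-neighbor idea mentioned in the abstract can be sketched roughly as follows: each query feature receives class probabilities as a similarity-weighted average of the one-hot labels of a labeled support set. This is only an illustrative sketch under assumed choices (cosine similarity, a temperature `tau`, and the function name `soft_nn_predict` are all assumptions here, not the paper's exact formulation).

```python
import numpy as np

def soft_nn_predict(query, support, labels, num_classes, tau=0.1):
    """Soft nearest-neighbor prediction (illustrative sketch).

    Class probabilities for each query feature are a softmax-weighted
    average, over labeled support features, of their one-hot labels.
    """
    # L2-normalize so the dot product is cosine similarity.
    q = query / np.linalg.norm(query, axis=1, keepdims=True)
    s = support / np.linalg.norm(support, axis=1, keepdims=True)
    sim = q @ s.T / tau                       # (n_query, n_support)
    # Softmax over the support samples (numerically stabilized).
    w = np.exp(sim - sim.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    one_hot = np.eye(num_classes)[labels]     # (n_support, num_classes)
    return w @ one_hot                        # (n_query, num_classes)
```

Because the prediction is a soft vote over all support features rather than a hard arg-max over the single nearest one, it yields a differentiable, non-linear partition of the feature space, which is consistent with the abstract's description of using nearest-neighbor classifiers for representation learning and distillation.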

Dates and versions

hal-03893056, version 1 (10-12-2022)
hal-03893056, version 2 (05-04-2023)
hal-03893056, version 3 (11-09-2023)

Identifiers

  • HAL Id: hal-03893056, version 1

Cite

Zhiqi Kang, Enrico Fini, Moin Nabi, Elisa Ricci, Karteek Alahari. A soft nearest-neighbor framework for continual semi-supervised learning. 2022. ⟨hal-03893056v1⟩