Uncertainty-guided semi-supervised few-shot class-incremental learning with knowledge distillation
Cui, Yawen; Deng, Wanxia; Xu, Xin; Liu, Zhen; Liu, Zhong; Pietikäinen, Matti; Liu, Li (2022-09-22)
Y. Cui et al., "Uncertainty-Guided Semi-Supervised Few-Shot Class-Incremental Learning With Knowledge Distillation," in IEEE Transactions on Multimedia, vol. 25, pp. 6422-6435, 2023, doi: 10.1109/TMM.2022.3208743
© 2022 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
https://rightsstatements.org/vocab/InC/1.0/
https://urn.fi/URN:NBN:fi-fe2023040334591
Abstract
Class-Incremental Learning (CIL) aims to incrementally learn novel classes without forgetting old ones. This capability becomes more challenging when novel tasks contain only one or a few labeled training samples, leading to a more practical scenario, i.e., Few-Shot Class-Incremental Learning (FSCIL). The dilemma of FSCIL lies in severe overfitting and exacerbated catastrophic forgetting caused by the limited training data from novel classes. In this paper, motivated by the easy accessibility of unlabeled data, we conduct pioneering work on the Semi-Supervised Few-Shot Class-Incremental Learning (Semi-FSCIL) problem, which requires the model to incrementally learn new classes from extremely limited labeled samples and a large number of unlabeled samples. To address this problem, a simple but efficient framework is first constructed based on knowledge distillation to alleviate catastrophic forgetting. To mitigate overfitting on novel categories with unlabeled data, uncertainty-guided semi-supervised learning is incorporated into this framework to select unlabeled samples for incremental learning sessions according to model uncertainty. This process provides extra reliable supervision for the distillation process and helps to better form the class means. Extensive experiments on the CIFAR100, miniImageNet and CUB200 datasets demonstrate the promising performance of the proposed method and establish baselines for this new research direction.
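The abstract describes selecting unlabeled samples for incremental sessions according to model uncertainty. The paper's exact criterion is not given here; the following is a minimal sketch assuming predictive entropy of the softmax output as the uncertainty measure, with low-entropy (confident) samples admitted and pseudo-labeled. All function names and the threshold are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

def predictive_entropy(probs):
    """Entropy of per-sample softmax outputs (shape: n_samples x n_classes).
    Higher entropy means the model is more uncertain about that sample."""
    eps = 1e-12  # avoid log(0)
    return -np.sum(probs * np.log(probs + eps), axis=1)

def select_confident_unlabeled(probs, threshold=0.5):
    """Keep unlabeled samples whose predictive entropy falls below the
    threshold; return their indices and argmax pseudo-labels, which would
    then provide extra supervision in the incremental session."""
    entropy = predictive_entropy(probs)
    keep = np.where(entropy < threshold)[0]
    pseudo_labels = probs[keep].argmax(axis=1)
    return keep, pseudo_labels
```

For example, a sample with output `[0.97, 0.02, 0.01]` has entropy ≈ 0.15 and would be selected, while a near-uniform `[0.34, 0.33, 0.33]` has entropy ≈ 1.10 and would be rejected under a threshold of 0.5.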
Collections
- Open access [32026]
Similar items
Showing items with similar titles, authors or subjects.
- Linking learning behavior analytics and learning science concepts : designing a learning analytics dashboard for feedback to support learning regulation
  Sedrakyan, Gayane; Malmberg, Jonna; Verbert, Katrien; Järvelä, Sanna; Kirschner, Paul A.
  Computers in human behavior (Elsevier, 06.05.2018)
- Learning enablers, learning outcomes, learning paths, and their relationships in organizational learning and change
  Haho, Päivi
  Acta Universitatis Ouluensis. C, Technica : 479 (University of Oulu, 31.01.2014)
- Bridging learning sciences, machine learning and affective computing for understanding cognition and affect in collaborative learning
  Järvelä, Sanna; Gašević, Dragan; Seppänen, Tapio; Pechenizkiy, Mykola; Kirschner, Paul A.
  British journal of educational technology : 6 (John Wiley & Sons, 06.03.2020)