Exemplar-supported representation for effective class-incremental learning.
Guo, Lei; Xie, Gang; Xu, Xinying; Ren, Jinchang
Abstract
Catastrophic forgetting is a key challenge for class-incremental learning with deep neural networks, where performance decreases considerably when dealing with long sequences of new classes. To tackle this issue, in this paper we propose a new exemplar-supported representation for incremental learning (ESRIL) approach that consists of three components. First, we use memory aware synapses (MAS) pre-trained on ImageNet to retain the ability of robust representation learning and classification for old classes from the perspective of the model. Second, exemplar-based subspace clustering (ESC) is utilized to construct the exemplar set, which can preserve performance across various views of the data. Third, the nearest class multiple centroids (NCMC) classifier is used to save the training cost of the fully connected layer of MAS when the criterion is met. Extensive experiments and analyses are presented to show the influence of various backbone structures and the effectiveness of different components in our model. Experiments on several general-purpose and fine-grained image recognition datasets fully demonstrate the efficacy of the proposed methodology.
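The abstract's third component, the nearest class multiple centroids (NCMC) classifier, assigns a test sample to the class whose nearest stored centroid is closest in feature space. The sketch below is only a minimal illustration of that idea, not the authors' implementation: the use of k-means to pick per-class centroids, the class and parameter names, and the `centroids_per_class` value are all assumptions made for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans


class NCMCClassifier:
    """Illustrative nearest-class-multiple-centroids (NCMC) classifier.

    Each class is summarised by a small set of centroids computed from its
    stored exemplar features (here with k-means, an assumption); a test
    feature is assigned to the class owning the nearest centroid.
    """

    def __init__(self, centroids_per_class=3):
        self.centroids_per_class = centroids_per_class
        self.centroids = None        # (n_centroids, feat_dim)
        self.centroid_labels = None  # class label for each centroid row

    def fit(self, features_by_class):
        """features_by_class: dict {class_label: (n_i, feat_dim) array}."""
        centroid_blocks, labels = [], []
        for label, feats in features_by_class.items():
            k = min(self.centroids_per_class, len(feats))
            km = KMeans(n_clusters=k, n_init=10).fit(feats)
            centroid_blocks.append(km.cluster_centers_)
            labels.extend([label] * k)
        self.centroids = np.vstack(centroid_blocks)
        self.centroid_labels = np.asarray(labels)
        return self

    def predict(self, features):
        """Assign each row of `features` to the class of its nearest centroid."""
        # Pairwise squared Euclidean distances, shape (n_samples, n_centroids).
        d = ((features[:, None, :] - self.centroids[None, :, :]) ** 2).sum(-1)
        return self.centroid_labels[d.argmin(axis=1)]
```

In the paper's setting, the features would come from the MAS-trained backbone and the per-class exemplars from the ESC-selected exemplar set; the sketch above leaves both of those steps out.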
Citation
GUO, L., XIE, G., XU, X. and REN, J. 2020. Exemplar-supported representation for effective class-incremental learning. IEEE access [online], 8, pages 51276-51284. Available from: https://doi.org/10.1109/ACCESS.2020.2980386
| Journal Article Type | Article |
| --- | --- |
| Acceptance Date | Mar 9, 2020 |
| Online Publication Date | Mar 12, 2020 |
| Publication Date | Dec 31, 2020 |
| Deposit Date | May 6, 2022 |
| Publicly Available Date | Jun 7, 2022 |
| Journal | IEEE Access |
| Electronic ISSN | 2169-3536 |
| Publisher | Institute of Electrical and Electronics Engineers (IEEE) |
| Peer Reviewed | Peer Reviewed |
| Volume | 8 |
| Pages | 51276-51284 |
| DOI | https://doi.org/10.1109/ACCESS.2020.2980386 |
| Keywords | Task analysis; Image recognition; Machine learning; Synapses; Robustness; Training; Feature extraction; Exemplar-based subspace clustering; Incremental learning; Memory aware synapses |
| Public URL | https://rgu-repository.worktribe.com/output/1085422 |
Files
GUO 2020 Exemplar-supported (VOR): PDF, 6 MB
Publisher Licence URL: https://creativecommons.org/licenses/by/4.0/
Copyright Statement
© 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.