Class-incremental Learning using a Sequence of Partial Implicitly Regularized Classifiers
DOI: https://doi.org/10.32473/flairs.v35i.130549

Keywords: continual learning, class-incremental learning, catastrophic forgetting

Abstract
In class-incremental learning, the objective is to learn a number of classes sequentially without access to the whole training data at once. However, due to a problem known as catastrophic forgetting, neural networks suffer a substantial performance drop in such settings. The problem is often approached with experience replay, a method that stores a limited number of samples to be replayed in future steps to reduce forgetting of previously learned classes. When using a pretrained network as a feature extractor, we show that instead of training a single classifier incrementally, it is better to train a number of specialized classifiers that do not interfere with each other yet can cooperatively predict a single class. Our experiments on the CIFAR100 dataset show that the proposed method improves performance over the state of the art by a large margin.
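The abstract does not spell out the architecture, but the core idea (several non-interfering partial classifiers sharing a frozen pretrained feature extractor and cooperating at prediction time) can be sketched in PyTorch. This is a minimal illustration under our own assumptions, not the authors' exact implementation: the class name PartialClassifierEnsemble, the per-task linear heads, and prediction by logit concatenation are illustrative choices, and the experience-replay buffer is omitted for brevity.

```python
# Minimal sketch: a frozen pretrained backbone feeds a sequence of
# per-task "partial" classifiers. Each head is trained only on its own
# classes (plus any replayed exemplars), so heads do not interfere; at
# test time they cooperate by concatenating their logits over all
# classes seen so far. Names and structure are illustrative assumptions.
import torch
import torch.nn as nn


class PartialClassifierEnsemble(nn.Module):
    def __init__(self, backbone: nn.Module, feat_dim: int):
        super().__init__()
        self.backbone = backbone
        self.backbone.requires_grad_(False)  # pretrained extractor stays frozen
        self.heads = nn.ModuleList()         # one partial classifier per task
        self.feat_dim = feat_dim

    def add_task(self, num_new_classes: int) -> nn.Module:
        """Attach a fresh head for the classes of the incoming task."""
        head = nn.Linear(self.feat_dim, num_new_classes)
        self.heads.append(head)
        return head  # train this head on the new task's data

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        with torch.no_grad():
            feats = self.backbone(x)
        # Cooperative prediction: concatenate every head's logits and let
        # the argmax range over the union of all classes learned so far.
        return torch.cat([head(feats) for head in self.heads], dim=1)


if __name__ == "__main__":
    # Toy backbone standing in for a pretrained network on 32x32 inputs.
    backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 64))
    model = PartialClassifierEnsemble(backbone, feat_dim=64)
    model.add_task(10)  # task 1: classes 0-9
    model.add_task(10)  # task 2: classes 10-19
    logits = model(torch.randn(4, 3, 32, 32))
    print(logits.shape)  # torch.Size([4, 20]): one score per class seen so far
```

Because the backbone is frozen and each head's parameters are disjoint, training a new head cannot overwrite the weights of earlier ones, which is the intuition behind the non-interference claim in the abstract.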
License

Copyright © 2022 Sobirdzhon Bobiev, Albina Khusainova, Adil Khan, S.M. Ahsan Kazmi

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.