Class-incremental Learning using a Sequence of Partial Implicitly Regularized Classifiers

Authors

  • Sobirdzhon Bobiev, Innopolis University
  • Albina Khusainova, Innopolis University
  • Adil Khan, Innopolis University
  • S.M. Ahsan Kazmi, University of the West of England

DOI:

https://doi.org/10.32473/flairs.v35i.130549

Keywords:

continual learning, class-incremental learning, catastrophic forgetting

Abstract

In class-incremental learning, the objective is to learn a number of classes sequentially without access to the whole training data at once. However, due to a problem known as catastrophic forgetting, neural networks suffer a substantial performance drop in such settings. The problem is often approached with experience replay, a method that stores a limited number of samples to be replayed in future steps to reduce forgetting of previously learned classes. We show that, when a pretrained network is used as a feature extractor, it is better to train a number of specialized classifiers, which do not interfere with each other yet can cooperatively predict a single class, than to train a single classifier incrementally. Our experiments on the CIFAR100 dataset show that the proposed method improves performance over the state of the art by a large margin.
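The abstract only outlines the method, so the following is a minimal sketch of the general idea rather than the authors' implementation. It assumes a frozen pretrained backbone that yields fixed features, one linear head per incremental task trained in isolation (optionally alongside replayed exemplars), and prediction by concatenating the heads' logits so a single class is chosen over all classes seen so far. The names PartialClassifierEnsemble and add_task are hypothetical.

import torch
import torch.nn as nn

class PartialClassifierEnsemble(nn.Module):
    """Sketch: one small linear head per incremental task over frozen features."""

    def __init__(self, feature_dim):
        super().__init__()
        self.feature_dim = feature_dim
        self.heads = nn.ModuleList()

    def add_task(self, num_new_classes):
        # Hypothetical helper: create a head for the classes of a new task.
        head = nn.Linear(self.feature_dim, num_new_classes)
        self.heads.append(head)
        return head  # only this head is optimized while learning the new task

    def forward(self, features):
        # Concatenating per-task logits lets an argmax over the joint
        # label space pick a single class across all tasks seen so far.
        return torch.cat([head(features) for head in self.heads], dim=1)

ensemble = PartialClassifierEnsemble(feature_dim=512)
head = ensemble.add_task(num_new_classes=10)   # task 0: classes 0..9
# ... train `head` alone on task-0 features (plus replayed exemplars) ...
features = torch.randn(4, 512)   # stand-in for frozen-backbone features
pred = ensemble(features).argmax(dim=1)        # one class over the joint space

Because each head only ever sees gradients from its own task, the heads cannot interfere with one another, which is one plausible reading of the "partial" classifiers described in the abstract.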

Published

2022-05-04

How to Cite

Bobiev, S., Khusainova, A., Khan, A., & Ahsan Kazmi, S. (2022). Class-incremental Learning using a Sequence of Partial Implicitly Regularized Classifiers. The International FLAIRS Conference Proceedings, 35. https://doi.org/10.32473/flairs.v35i.130549

Issue

Vol. 35 (2022)

Section

Special Track: Neural Networks and Data Mining