Class-incremental Learning using a Sequence of Partial Implicitly Regularized Classifiers

Authors

  • Sobirdzhon Bobiev Innopolis University
  • Albina Khusainova Innopolis University
  • Adil Khan Innopolis University
  • S.M. Ahsan Kazmi University of the West of England

DOI:

https://doi.org/10.32473/flairs.v35i.130549

Keywords:

continual learning, class-incremental learning, catastrophic forgetting

Abstract

In class-incremental learning, the objective is to learn a number of classes sequentially without access to the whole training data. However, due to a problem known as catastrophic forgetting, neural networks suffer a substantial performance drop in such settings. The problem is often approached with experience replay, a method that stores a limited number of samples to be replayed in future steps to reduce forgetting of the learned classes. When using a pretrained network as a feature extractor, we show that instead of training a single classifier incrementally, it is better to train a number of specialized classifiers which do not interfere with each other yet can cooperatively predict a single class. Our experiments on the CIFAR-100 dataset show that the proposed method improves performance over the state of the art (SOTA) by a large margin.
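The abstract's core idea can be sketched structurally: a frozen, pretrained feature extractor; one specialized classifier per incremental step, trained only on that step's classes; and cooperation at inference by merging the heads' scores. The sketch below is an illustration of that structure only, not the authors' implementation: the nearest-class-mean heads, the random-projection "feature extractor", the synthetic clustered data, and all names (`PartialClassifier`, `extract`, `sample`) are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pretrained, frozen feature extractor (the paper uses a
# real pretrained network; this fixed random projection is illustrative).
W = rng.normal(size=(16, 12))

def extract(X):
    return np.tanh(X @ W)

class PartialClassifier:
    """One specialized head, trained only on its own step's classes."""
    def __init__(self):
        self.means = {}  # class label -> mean feature vector

    def fit(self, X, y):
        for c in np.unique(y):
            self.means[int(c)] = extract(X[y == c]).mean(axis=0)

    def scores(self, x):
        f = extract(x)
        # Negative distance to a class mean serves as that class's score.
        return {c: -np.linalg.norm(f - m) for c, m in self.means.items()}

def predict(heads, x):
    # Heads never interfere (each stores only its own classes) but
    # cooperate at inference: merge every head's scores, take the best.
    merged = {}
    for head in heads:
        merged.update(head.scores(x))
    return max(merged, key=merged.get)

# Synthetic stand-in data: one well-separated cluster per class.
centers = {c: rng.normal(scale=3.0, size=16) for c in range(4)}

def sample(classes, n=30):
    X = np.vstack([centers[c] + 0.2 * rng.normal(size=(n, 16))
                   for c in classes])
    y = np.concatenate([np.full(n, c) for c in classes])
    return X, y

# Two incremental steps over disjoint class sets; no step revisits
# earlier data, yet earlier heads are kept and reused at inference.
heads = []
for step_classes in ([0, 1], [2, 3]):
    X, y = sample(step_classes)
    head = PartialClassifier()
    head.fit(X, y)
    heads.append(head)

X_test, y_test = sample(range(4), n=10)
acc = np.mean([predict(heads, x) == t for x, t in zip(X_test, y_test)])
print(f"accuracy over all seen classes: {acc:.2f}")
```

Because each head is fit in isolation, adding a new step cannot overwrite earlier heads' parameters, which is the non-interference property the abstract refers to; forgetting is confined to whatever the shared (frozen) feature extractor cannot represent.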


Published

04-05-2022

How to Cite

Bobiev, S., Khusainova, A., Khan, A., & Ahsan Kazmi, S. (2022). Class-incremental Learning using a Sequence of Partial Implicitly Regularized Classifiers. The International FLAIRS Conference Proceedings, 35. https://doi.org/10.32473/flairs.v35i.130549

Issue

Section

Special Track: Neural Networks and Data Mining