A Comparative Study of Continual, Lifelong, and Online Supervised Learning Libraries
DOI:
https://doi.org/10.32473/flairs.36.133171

Keywords:
online learning, continual learning, lifelong learning, classification

Abstract
Machine learning has proven to be a crucial part of big data analytics; however, it falls short when data streams in continuously from a system and drifts far from the original training data. Online learning is machine learning for streaming data that arrives in sequential order, where the model updates after every data point. While machine learning relies on well-established libraries such as PyTorch and Keras, the libraries for online learning are less well known, though they serve the same purposes of reproducibility and reducing the time from research to production. Here, we compare libraries for online learning research, specifically supervised learning, along two axes: the development experience they offer researchers and their performance on benchmark tests. Our comparison as developers takes maintenance, documentation, and the availability of state-of-the-art algorithms into account. Because this assessment is not necessarily free of bias, we also run benchmarks known in online learning to measure the power usage, RAM usage, speed, and accuracy of these libraries and obtain an objective view. Our findings show that Avalanche and River, including River-torch, are among the best libraries in terms of performance and applicability to research in supervised online learning.
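As an illustration of the per-sample update loop that distinguishes online learning from batch training, the sketch below uses River's incremental API (predict_one / learn_one) with progressive-validation accuracy. The dataset, pipeline, and model choices are illustrative assumptions for this sketch, not the benchmark configuration used in the paper.

```python
# Illustrative sketch of online (incremental) supervised learning with River.
# The dataset and model below are assumptions for demonstration purposes only.
from river import datasets, linear_model, metrics, preprocessing

# Standard scaling composed with logistic regression, updated one sample at a time.
model = preprocessing.StandardScaler() | linear_model.LogisticRegression()
metric = metrics.Accuracy()

# Progressive validation: predict on each incoming example, then learn from it.
for x, y in datasets.Phishing():
    y_pred = model.predict_one(x)   # predict before seeing the label
    metric.update(y, y_pred)        # score the prediction
    model.learn_one(x, y)           # update the model with this single example

print(metric)
```

This one-pass, predict-then-learn loop is the workload the benchmarked libraries are designed for; batch frameworks such as PyTorch and Keras instead assume the full training set is available up front.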
License
Copyright (c) 2023 Logan, Brad Killen, Somayeh Bakhtiari Ramezani, Shahram Rahimi, Maria Seale, Sudip Mittal
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.