A Comparison of AutoML Hyperparameter Optimization Tools For Tabular Data
DOI: https://doi.org/10.32473/flairs.36.133357

Abstract
The performance of machine learning (ML) methods for classification and regression tasks on tabular datasets is sensitive to hyperparameter values. Hyperparameters are the values or configurations that control an algorithm's behavior while it builds the model. Finding optimal values for these hyperparameters is therefore essential for improving an ML algorithm's prediction accuracy and for model selection. However, manually searching for the best configuration is tedious, and many AutoML (Automated Machine Learning) frameworks have recently been proposed to help practitioners solve this problem. Hyperparameter optimization (HPO) is the guided process of finding, in a reasonable amount of time, the combination of hyperparameters that delivers the best performance on the data and task at hand. In this work, we compare the performance of two frequently used AutoML HPO frameworks, Optuna and HyperOpt, on popular OpenML tabular datasets to identify the better framework for tabular data. The experimental results show that Optuna achieves better predictive performance, while HyperOpt is the faster of the two at hyperparameter optimization.
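To make the HPO loop described above concrete, the sketch below implements it as a plain random search in pure Python. Frameworks such as Optuna and HyperOpt automate this same loop with more sophisticated samplers (e.g., tree-structured Parzen estimators); the `objective` function, hyperparameter names, and search ranges here are illustrative stand-ins, not taken from the paper.

```python
import random

def objective(params):
    # Stand-in for a real train/validate cycle: a toy "validation loss"
    # minimized at lr = 0.1 and depth = 5 (illustrative values only).
    return (params["lr"] - 0.1) ** 2 + (params["depth"] - 5) ** 2

def random_search(n_trials, seed=0):
    """Try n_trials random hyperparameter combinations; keep the best."""
    rng = random.Random(seed)
    best_params, best_loss = None, float("inf")
    for _ in range(n_trials):
        params = {
            "lr": rng.uniform(0.001, 1.0),  # continuous hyperparameter
            "depth": rng.randint(1, 10),    # integer hyperparameter
        }
        loss = objective(params)
        if loss < best_loss:
            best_params, best_loss = params, loss
    return best_params, best_loss
```

A guided optimizer replaces the uniform sampling inside the loop with a model of which regions of the search space have performed well so far, which is what distinguishes tools like Optuna and HyperOpt from this baseline.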
License
Copyright (c) 2023 Prativa Pokhrel, Alina Lazar
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.