Simple deterministic selection-based genetic algorithm for hyperparameter tuning of machine learning models

Raji, Ismail Damilola; Bello-Salau, Habeeb; Umoh, Ime Jarlath; Onumanyi, Adeiza James; Adegboye, Mutiu Adesina; Salawudeen, Ahmed Tijani


Abstract

Hyperparameter tuning is a critical function necessary for the effective deployment of most machine learning (ML) algorithms. It is used to find the optimal hyperparameter settings of an ML algorithm in order to improve its overall output performance. To this effect, several optimization strategies have been studied for fine-tuning the hyperparameters of many ML algorithms, especially in the absence of model-specific information. However, because most ML training procedures require a significant amount of computational time and memory, it is frequently necessary to build an optimization technique that converges within a small number of fitness evaluations. Consequently, a simple deterministic selection genetic algorithm (SDSGA) is proposed in this article. The SDSGA is realized by ensuring that both the chromosomes and their accompanying fitness values in the original genetic algorithm are selected in an elitist-like manner. We assessed the SDSGA over a variety of mathematical test functions. It was then used to optimize the hyperparameters of two well-known machine learning models, namely the convolutional neural network (CNN) and the random forest (RF) algorithm, with application to the MNIST and UCI classification datasets. The SDSGA's efficiency was compared with that of Bayesian optimization (BO) and three other popular metaheuristic optimization algorithms (MOAs), namely the genetic algorithm (GA), particle swarm optimization (PSO) and biogeography-based optimization (BBO). The results reveal that the SDSGA performed better than the other MOAs on 11 of the 17 benchmark functions considered in our study. In optimizing the hyperparameters of the two ML models, it achieved marginally better accuracy than the other methods while requiring less computation time.
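As a rough illustration of the elitist-like selection the abstract describes, the sketch below implements a genetic algorithm whose survivor selection deterministically keeps the best chromosomes, together with their fitness values, from the combined parent and offspring pool. The operator choices, rates, and the function name sdsga_like are assumptions made for illustration, not the authors' SDSGA implementation.

    # Minimal sketch of a GA with deterministic, elitist-like survivor
    # selection, in the spirit of the SDSGA described in the abstract.
    # Illustrative approximation only; not the authors' implementation.
    import numpy as np

    def sdsga_like(fitness, bounds, pop_size=20, generations=30,
                   mutation_rate=0.1, rng=None):
        """Minimize `fitness` over a box given by `bounds` (list of (low, high))."""
        rng = np.random.default_rng(rng)
        dim = len(bounds)
        low = np.array([b[0] for b in bounds])
        high = np.array([b[1] for b in bounds])

        # Random initial population and its fitness values.
        pop = rng.uniform(low, high, size=(pop_size, dim))
        fit = np.array([fitness(x) for x in pop])

        for _ in range(generations):
            # Offspring via uniform crossover with randomly permuted partners.
            partners = pop[rng.permutation(pop_size)]
            mask = rng.random((pop_size, dim)) < 0.5
            children = np.where(mask, pop, partners)

            # Gaussian mutation on a fraction of genes, clipped to the bounds.
            mutate = rng.random((pop_size, dim)) < mutation_rate
            children = children + mutate * rng.normal(
                0.0, 0.1 * (high - low), size=(pop_size, dim))
            children = np.clip(children, low, high)
            child_fit = np.array([fitness(x) for x in children])

            # Deterministic, elitist-like selection: keep the best pop_size
            # individuals from the combined parent + offspring pool, carrying
            # chromosomes and their fitness values together.
            combined = np.vstack([pop, children])
            combined_fit = np.concatenate([fit, child_fit])
            keep = np.argsort(combined_fit)[:pop_size]
            pop, fit = combined[keep], combined_fit[keep]

        best = np.argmin(fit)
        return pop[best], fit[best]

    # Example: minimize the 5-dimensional sphere benchmark function.
    if __name__ == "__main__":
        sphere = lambda x: float(np.sum(x ** 2))
        x_best, f_best = sdsga_like(sphere, [(-5.0, 5.0)] * 5, rng=0)
        print(x_best, f_best)

For hyperparameter tuning, the fitness function would wrap model training and validation, for example decoding each chromosome into random forest settings (number of trees, maximum depth, and so on) and returning the negative validation accuracy, so that minimization maximizes accuracy while each evaluation remains a full training run.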

Citation

RAJI, I.D., BELLO-SALAU, H., UMOH, I.J., ONUMANYI, A.J., ADEGBOYE, M.A. and SALAWUDEEN, A.T. 2022. Simple deterministic selection-based genetic algorithm for hyperparameter tuning of machine learning models. Applied sciences [online], 12(3), article 1186. Available from: https://doi.org/10.3390/app12031186

Journal Article Type: Article
Acceptance Date: Jan 20, 2022
Online Publication Date: Jan 24, 2022
Publication Date: Feb 1, 2022
Deposit Date: Feb 3, 2022
Publicly Available Date: Feb 3, 2022
Journal: Applied Sciences
Electronic ISSN: 2076-3417
Publisher: MDPI
Peer Reviewed: Yes
Volume: 12
Issue: 3
Article Number: 1186
DOI: https://doi.org/10.3390/app12031186
Keywords: Algorithm; Convolutional neural network; Hyperparameter; Random forest; Machine learning; Metaheuristic; Optimization
Public URL: https://rgu-repository.worktribe.com/output/1580650
