Simple deterministic selection-based genetic algorithm for hyperparameter tuning of machine learning models.
Raji, Ismail Damilola; Bello-Salau, Habeeb; Umoh, Ime Jarlath; Onumanyi, Adeiza James; Adegboye, Mutiu Adesina; Salawudeen, Ahmed Tijani
Hyperparameter tuning is a critical function necessary for the effective deployment of most machine learning (ML) algorithms. It is used to find the optimal hyperparameter settings of an ML algorithm in order to improve its overall output performance. To this effect, several optimization strategies have been studied for fine-tuning the hyperparameters of many ML algorithms, especially in the absence of model-specific information. However, because most ML training procedures need a significant amount of computational time and memory, it is frequently necessary to build an optimization technique that converges within a small number of fitness evaluations. As a result, a simple deterministic selection genetic algorithm (SDSGA) is proposed in this article. The SDSGA was realized by ensuring that both chromosomes and their accompanying fitness values in the original genetic algorithm are selected in an elitist-like way. We assessed the SDSGA over a variety of mathematical test functions. It was then used to optimize the hyperparameters of two well-known machine learning models, namely, the convolutional neural network (CNN) and the random forest (RF) algorithm, with application to the MNIST and UCI classification datasets. The SDSGA's efficiency was compared to that of Bayesian Optimization (BO) and three other popular metaheuristic optimization algorithms (MOAs), namely, the genetic algorithm (GA), particle swarm optimization (PSO) and biogeography-based optimization (BBO) algorithms. The results obtained reveal that the SDSGA performed better than the other MOAs in solving 11 of the 17 known benchmark functions considered in our study. While optimizing the hyperparameters of the two ML models, it performed marginally better in terms of accuracy than the other methods while taking less time to compute.
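The abstract's core idea — replacing the GA's stochastic selection step with a deterministic, elitist-like survival rule over chromosomes and their fitness values — can be illustrated with a minimal sketch. This is not the authors' implementation; the population size, mutation rate, and benchmark objective below are illustrative assumptions, and the sphere function stands in for the paper's benchmark suite:

```python
import random

def sphere(x):
    """Illustrative benchmark objective: sum of squares, global minimum 0 at the origin."""
    return sum(v * v for v in x)

def deterministic_selection_ga(obj, dim=5, pop_size=20, gens=50, bounds=(-5.0, 5.0), seed=0):
    """Sketch of a GA whose selection is deterministic and elitist-like:
    parents and offspring are pooled each generation and only the
    pop_size fittest chromosomes survive; no roulette or tournament
    randomness is involved in selection itself."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=obj)                       # rank chromosomes by fitness
        offspring = []
        for i in range(0, pop_size, 2):
            p1, p2 = pop[i], pop[min(i + 1, pop_size - 1)]
            cut = rng.randrange(1, dim)         # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.2:              # small Gaussian mutation
                j = rng.randrange(dim)
                child[j] += rng.gauss(0.0, 0.3)
            offspring.append(child)
        # deterministic elitist survival: best pop_size of parents + offspring
        pop = sorted(pop + offspring, key=obj)[:pop_size]
    return min(pop, key=obj)

best = deterministic_selection_ga(sphere)
```

Because the pooled truncation step never discards the incumbent best chromosome, the best fitness is monotonically non-increasing across generations, which is consistent with the abstract's motivation of converging within few fitness evaluations.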
RAJI, I.D., BELLO-SALAU, H., UMOH, I.J., ONUMANYI, A.J., ADEGBOYE, M.A. and SALAWUDEEN, A.T. 2022. Simple deterministic selection-based genetic algorithm for hyperparameter tuning of machine learning models. Applied sciences [online], 12(3), article 1186. Available from: https://doi.org/10.3390/app12031186
| Field | Value |
| --- | --- |
| Journal Article Type | Article |
| Acceptance Date | Jan 20, 2022 |
| Online Publication Date | Jan 24, 2022 |
| Publication Date | Feb 1, 2022 |
| Deposit Date | Feb 3, 2022 |
| Publicly Available Date | Feb 3, 2022 |
| Peer Reviewed | Peer Reviewed |
| Keywords | Algorithm; Convolutional neural network; Hyperparameter; Random forest; Machine learning; Metaheuristic; Optimization |