Simple deterministic selection-based genetic algorithm for hyperparameter tuning of machine learning models.
Raji, Ismail Damilola; Bello-Salau, Habeeb; Umoh, Ime Jarlath; Onumanyi, Adeiza James; Adegboye, Mutiu Adesina; Salawudeen, Ahmed Tijani
Abstract
Hyperparameter tuning is a critical function necessary for the effective deployment of most machine learning (ML) algorithms. It is used to find the optimal hyperparameter settings of an ML algorithm in order to improve its overall output performance. To this effect, several optimization strategies have been studied for fine-tuning the hyperparameters of many ML algorithms, especially in the absence of model-specific information. However, because most ML training procedures need a significant amount of computational time and memory, it is frequently necessary to build an optimization technique that converges within a small number of fitness evaluations. As a result, a simple deterministic selection genetic algorithm (SDSGA) is proposed in this article. The SDSGA was realized by ensuring that both the chromosomes and their accompanying fitness values in the original genetic algorithm are selected in an elitist-like way. We assessed the SDSGA over a variety of mathematical test functions. It was then used to optimize the hyperparameters of two well-known machine learning models, namely the convolutional neural network (CNN) and the random forest (RF) algorithm, with application to the MNIST and UCI classification datasets. The SDSGA's efficiency was compared with that of Bayesian optimization (BO) and three other popular metaheuristic optimization algorithms (MOAs), namely the genetic algorithm (GA), particle swarm optimization (PSO) and biogeography-based optimization (BBO). The results obtained reveal that the SDSGA performed better than the other MOAs in solving 11 of the 17 known benchmark functions considered in our study. While optimizing the hyperparameters of the two ML models, it performed marginally better in terms of accuracy than the other methods while taking less time to compute.
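The paper's exact update rule is not reproduced on this record, but the abstract describes the key idea: both chromosomes and their accompanying fitness values are carried over and selected in an elitist-like, deterministic way. As a minimal sketch only (the fitness function, population sizes, and helper names below are illustrative assumptions, not the authors' implementation), a deterministic elitist survivor-selection step that merges parents and offspring and keeps the best individuals might look like:

```python
import random


def evaluate(chromosome):
    # Hypothetical fitness for illustration: the sphere function (minimization).
    # In hyperparameter tuning this would be, e.g., a model's validation error.
    return sum(x * x for x in chromosome)


def deterministic_elitist_selection(parents, offspring, parent_fits, offspring_fits):
    """Merge parents and offspring with their cached fitness values and
    deterministically keep the len(parents) best individuals (lowest fitness).

    Caching fitness alongside each chromosome avoids re-evaluating survivors,
    which matters when each evaluation is an expensive ML training run."""
    pool = list(zip(parents + offspring, parent_fits + offspring_fits))
    pool.sort(key=lambda pair: pair[1])  # ascending: best (smallest) first
    survivors = pool[: len(parents)]
    return [c for c, _ in survivors], [f for _, f in survivors]


random.seed(0)
parents = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(6)]
offspring = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(6)]
parent_fits = [evaluate(c) for c in parents]
offspring_fits = [evaluate(c) for c in offspring]
new_pop, new_fits = deterministic_elitist_selection(
    parents, offspring, parent_fits, offspring_fits
)
```

Because selection is deterministic, the best solution found so far can never be lost between generations, which helps convergence within a small budget of fitness evaluations.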
Citation
RAJI, I.D., BELLO-SALAU, H., UMOH, I.J., ONUMANYI, A.J., ADEGBOYE, M.A. and SALAWUDEEN, A.T. 2022. Simple deterministic selection-based genetic algorithm for hyperparameter tuning of machine learning models. Applied sciences [online], 12(3), article 1186. Available from: https://doi.org/10.3390/app12031186
Journal Article Type | Article |
---|---|
Acceptance Date | Jan 20, 2022 |
Online Publication Date | Jan 24, 2022 |
Publication Date | Feb 1, 2022 |
Deposit Date | Feb 3, 2022 |
Publicly Available Date | Feb 3, 2022 |
Journal | Applied Sciences |
Electronic ISSN | 2076-3417 |
Publisher | MDPI |
Peer Reviewed | Peer Reviewed |
Volume | 12 |
Issue | 3 |
Article Number | 1186 |
DOI | https://doi.org/10.3390/app12031186 |
Keywords | Algorithm; Convolutional neural network; Hyperparameter; Random forest; Machine learning; Metaheuristic; Optimization |
Public URL | https://rgu-repository.worktribe.com/output/1580650 |
Files
RAJI 2022 Simple deterministic selection (VOR)
(1 Mb)
PDF
Publisher Licence URL
https://creativecommons.org/licenses/by-nc-nd/4.0/
Copyright Statement
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).