
Research Repository


A novel surrogate model for variable-length encoding and its application in optimising deep learning architecture.

Dang, Truong; Nguyen, Tien Thanh; McCall, John; Han, Kate; Liew, Alan Wee-Chung


Abstract

Deep neural networks (DNNs) have achieved great success across multiple domains. In recent years, a number of approaches have emerged for automatically finding optimal DNN configurations. Among these, a technique showing great promise is Evolutionary Algorithms (EAs), which are based on observations of natural, biological processes. However, since an EA needs to evaluate many DNN candidates, the overall search becomes very expensive when training a single DNN takes a long time. A potential solution is to use a Surrogate-Assisted Evolutionary Algorithm (SAEA), in which a surrogate model is used to predict the performance of DNNs without training them. However, all popular surrogate models in the literature require a fixed-length input, while encodings of a DNN are usually variable-length, since a DNN structure is very complex and its depth, layer sizes, etc. cannot be known beforehand. In this paper, we propose a novel surrogate model for variable-length encodings to optimise deep learning architectures. An encoder-decoder model is used to convert the variable-length encoding into a fixed-length representation, which serves as input to the surrogate model to predict DNN performance without training. The weights of the encoder-decoder model are found by training on the variable-length data, with the targets being the same as the inputs, while the surrogate model is trained on the output of the encoder. In this study, a Long Short-Term Memory (LSTM) model is used as both the encoder and the decoder. Our proposed variable-length-encoding-based surrogate model is tested on a well-known method that evolves optimal Convolutional Neural Networks (CNNs). The experimental results show that our proposed method achieves competitive performance while significantly reducing the time of the optimisation process.
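The abstract's two-stage pipeline (train an LSTM encoder-decoder to reconstruct variable-length architecture encodings, then fit a surrogate on the fixed-length encoder state) can be sketched as below. This is a minimal sketch assuming PyTorch; the class names, the MSE reconstruction loss, the MLP surrogate head, the teacher-forced decoder, and the toy data are all illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (PyTorch, hypothetical names) of the idea in the abstract:
# stage 1 trains an LSTM encoder-decoder with targets equal to inputs;
# stage 2 trains a surrogate regressor on the fixed-length encoder state.
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_sequence

class LSTMAutoencoder(nn.Module):
    def __init__(self, token_dim, hidden_dim):
        super().__init__()
        self.encoder = nn.LSTM(token_dim, hidden_dim, batch_first=True)
        self.decoder = nn.LSTM(token_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, token_dim)

    def encode(self, padded, lengths):
        # Pack so the encoder ignores padding beyond each sequence's length.
        packed = pack_padded_sequence(padded, lengths, batch_first=True,
                                      enforce_sorted=False)
        _, (h, c) = self.encoder(packed)
        return h, c  # h[-1] is the fixed-length representation

    def forward(self, padded, lengths):
        h, c = self.encode(padded, lengths)
        # Simplified teacher forcing: the decoder sees the input sequence
        # and must reproduce it from the encoder's final state.
        dec_out, _ = self.decoder(padded, (h, c))
        return self.out(dec_out)

class Surrogate(nn.Module):
    """MLP that predicts DNN performance from the fixed-length encoding."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(hidden_dim, 64), nn.ReLU(),
                                 nn.Linear(64, 1))

    def forward(self, z):
        return self.net(z).squeeze(-1)

# Toy data: variable-length encodings (e.g. one feature vector per CNN layer).
token_dim, hidden_dim = 8, 32
seqs = [torch.randn(torch.randint(3, 10, (1,)).item(), token_dim)
        for _ in range(16)]
lengths = torch.tensor([len(s) for s in seqs])
padded = pad_sequence(seqs, batch_first=True)
fitness = torch.rand(16)  # stand-in for measured validation accuracy

# Stage 1: reconstruction training (targets are the same as the inputs).
ae = LSTMAutoencoder(token_dim, hidden_dim)
opt = torch.optim.Adam(ae.parameters(), lr=1e-3)
for _ in range(50):
    opt.zero_grad()
    loss = nn.functional.mse_loss(ae(padded, lengths), padded)
    loss.backward()
    opt.step()

# Stage 2: fit the surrogate on the frozen encoder's fixed-length outputs.
surrogate = Surrogate(hidden_dim)
opt2 = torch.optim.Adam(surrogate.parameters(), lr=1e-3)
with torch.no_grad():
    h, _ = ae.encode(padded, lengths)
    z = h[-1]  # one fixed-length vector per candidate architecture
for _ in range(100):
    opt2.zero_grad()
    loss = nn.functional.mse_loss(surrogate(z), fitness)
    loss.backward()
    opt2.step()
```

In an SAEA loop, the trained surrogate would then score candidate encodings in place of full DNN training, with only the most promising candidates actually trained and evaluated.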

Citation

DANG, T., NGUYEN, T.T., MCCALL, J., HAN, K. and LIEW, A.W.-C. 2024. A novel surrogate model for variable-length encoding and its application in optimising deep learning architecture. In Proceedings of the 2024 IEEE (Institute of Electrical and Electronics Engineers) Congress on evolutionary computation (CEC 2024), 30 June - 05 July 2024, Yokohama, Japan. Available from: https://doi.org/10.1109/CEC60901.2024.10611960

Presentation Conference Type Conference Paper (published)
Conference Name 2024 IEEE (Institute of Electrical and Electronics Engineers) Congress on evolutionary computation (CEC 2024)
Start Date Jun 30, 2024
End Date Jul 5, 2024
Acceptance Date Mar 15, 2024
Online Publication Date Jul 5, 2024
Publication Date Dec 31, 2024
Deposit Date Aug 16, 2024
Publicly Available Date Aug 16, 2024
Publisher Institute of Electrical and Electronics Engineers (IEEE)
Peer Reviewed Peer Reviewed
DOI https://doi.org/10.1109/cec60901.2024.10611960
Keywords Surrogate model; Variable-length encoding; Deep learning; Encoder-decoder; Sequence-to-sequence
Public URL https://rgu-repository.worktribe.com/output/2434446

Files

DANG 2024 A novel surrogate model (AAM) (553 KB)
PDF

Publisher Licence URL
https://creativecommons.org/licenses/by/4.0/

Copyright Statement
© 2024 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.



