Research Repository

Weighted ensemble of gross error detection methods based on particle swarm optimization. (2021)
Conference Proceeding
DOBOS, D., NGUYEN, T.T., MCCALL, J., WILSON, A., STOCKTON, P. and CORBETT, H. 2021. Weighted ensemble of gross error detection methods based on particle swarm optimization. In Chicano, F. (ed.) Proceedings of the 2021 Genetic and evolutionary computation conference (GECCO 2021), 10-14 July 2021, [virtual conference]. New York: ACM [online], pages 307-308. Available from: https://doi.org/10.1145/3449726.3459415

Gross errors, a kind of non-random error caused by process disturbances or leaks, can make reconciled estimates very inaccurate and even infeasible. Detecting gross errors thus prevents financial loss from incorrect accounting and also ident...

VEGAS: a variable length-based genetic algorithm for ensemble selection in deep ensemble learning. (2021)
Conference Proceeding
HAN, K., PHAM, T., VU, T.H., DANG, T., MCCALL, J. and NGUYEN, T.T. 2021. VEGAS: a variable length-based genetic algorithm for ensemble selection in deep ensemble learning. In Nguyen, N.T., Chittayasothorn, S., Niyato, D. and Trawiński, B. (eds.) Intelligent information and database systems: proceedings of the 13th Asian conference on intelligent information and database systems 2021 (ACIIDS 2021), 7-10 April 2021, [virtual conference]. Lecture Notes in Computer Science, 12672. Cham: Springer [online], pages 168–180. Available from: https://doi.org/10.1007/978-3-030-73280-6_14

In this study, we introduce an ensemble selection method for deep ensemble systems called VEGAS. The deep ensemble models include multiple layers of the ensemble of classifiers (EoC). At each layer, we train the EoC and generate training data for th...
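
As an illustration of the multi-layer structure described above (not the authors' VEGAS implementation), the sketch below builds a small deep ensemble in Python: each layer trains a few classifiers of different types, and their out-of-fold class probabilities become the training data for the next layer. The classifier choices, layer count, and dataset are assumptions made for the example.

```python
# A minimal sketch (not the authors' VEGAS implementation) of a deep ensemble
# in which each layer's classifiers produce the training data for the next layer.
# Classifier choices and layer count are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_predict, train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

layers, X_tr, X_te = [], X_train, X_test
for depth in range(3):  # three layers of an ensemble of classifiers (EoC)
    eoc = [LogisticRegression(max_iter=1000), GaussianNB(),
           DecisionTreeClassifier(random_state=0)]
    # Out-of-fold probabilities form the meta-features passed to the next layer.
    meta_tr = np.hstack([cross_val_predict(c, X_tr, y_train, cv=5, method="predict_proba")
                         for c in eoc])
    for c in eoc:
        c.fit(X_tr, y_train)
    meta_te = np.hstack([c.predict_proba(X_te) for c in eoc])
    layers.append(eoc)
    X_tr, X_te = meta_tr, meta_te

# Final prediction: average the class probabilities produced by the last layer.
n_classes = len(np.unique(y_train))
probs = X_te.reshape(X_te.shape[0], -1, n_classes).mean(axis=1)
print("test accuracy:", (probs.argmax(axis=1) == y_test).mean())
```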

A homogeneous-heterogeneous ensemble of classifiers. (2020)
Conference Proceeding
LUONG, A.V., VU, T.H., NGUYEN, P.M., VAN PHAM, N., MCCALL, J., LIEW, A.W.-C. and NGUYEN, T.T. 2020. A homogeneous-heterogeneous ensemble of classifiers. In Yang, H., Pasupa, K., Leung, A.C.-S., Kwok, J.T., Chan, J.H. and King, I. (eds.) Neural information processing: proceedings of 27th International conference on neural information processing 2020 (ICONIP 2020), part 5. Communications in computer and information science, 1333. Cham: Springer [online], pages 251-259. Available from: https://doi.org/10.1007/978-3-030-63823-8_30

In this study, we introduce an ensemble system by combining homogeneous ensemble and heterogeneous ensemble into a single framework. Based on the observation that the projected data is significantly different from the original data as well as each ot...
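
A rough sketch of the combined idea, under assumptions: the homogeneous part comes from training on several random projections of the data, the heterogeneous part from using different learner types on each projection, and the final prediction averages all members' class probabilities. The specific projections and learners below are illustrative, not the paper's configuration.

```python
# Illustrative homogeneous-heterogeneous ensemble: several random projections of
# the data, with a set of different learner types trained on each projection.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.random_projection import GaussianRandomProjection
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=600, n_features=40, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

members = []
for seed in range(5):  # five random projections of the original data
    proj = GaussianRandomProjection(n_components=15, random_state=seed).fit(X_tr)
    for clf in (LogisticRegression(max_iter=1000), GaussianNB(), KNeighborsClassifier()):
        clf.fit(proj.transform(X_tr), y_tr)
        members.append((proj, clf))

# Combine by averaging class probabilities over all (projection, classifier) pairs.
probs = np.mean([clf.predict_proba(p.transform(X_te)) for p, clf in members], axis=0)
print("test accuracy:", (probs.argmax(axis=1) == y_te).mean())
```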

Toward an ensemble of object detectors. (2020)
Conference Proceeding
DANG, T., NGUYEN, T.T. and MCCALL, J. 2020. Toward an ensemble of object detectors. In Yang, H., Pasupa, K., Leung, A.C.-S., Kwok, J.T., Chan, J.H. and King, I. (eds.) Neural information processing: proceedings of 27th International conference on neural information processing 2020 (ICONIP 2020), part 5. Communications in computer and information science, 1333. Cham: Springer [online], pages 458-467. Available from: https://doi.org/10.1007/978-3-030-63823-8_53

The field of object detection has witnessed great strides in recent years. With the wave of deep neural networks (DNN), many breakthroughs have been achieved for object detection problems that were previously thought to be difficult. However, ther...

Heterogeneous ensemble selection for evolving data streams. [Dataset] (2020)
Dataset
LUONG, A.V., NGUYEN, T.T., LIEW, A.W.-C. and WANG, S. 2021. Heterogeneous ensemble selection for evolving data streams. [Dataset]. Pattern recognition [online], 112, article ID 107743. Available from: https://www.sciencedirect.com/science/article/pii/S003132032030546X#sec0023

Ensemble learning has been widely applied to both batch data classification and streaming data classification. For the latter setting, most existing ensemble systems are homogeneous, which means they are generated from only one type of learning model....

Heterogeneous ensemble selection for evolving data streams. (2020)
Journal Article
LUONG, A.V., NGUYEN, T.T., LIEW, A.W.-C. and WANG, S. 2021. Heterogeneous ensemble selection for evolving data streams. Pattern recognition [online], 112, article ID 107743. Available from: https://doi.org/10.1016/j.patcog.2020.107743

Ensemble learning has been widely applied to both batch data classification and streaming data classification. For the latter setting, most existing ensemble systems are homogeneous, which means they are generated from only one type of learning model....

WEC: weighted ensemble of text classifiers. (2020)
Conference Proceeding
UPADHYAY, A., NGUYEN, T.T., MASSIE, S. and MCCALL, J. 2020. WEC: weighted ensemble of text classifiers. In Proceedings of 2020 Institute of Electrical and Electronics Engineers (IEEE) congress on evolutionary computation (IEEE CEC 2020), part of the 2020 (IEEE) World congress on computational intelligence (IEEE WCCI 2020) and co-located with the 2020 International joint conference on neural networks (IJCNN 2020) and the 2020 IEEE International fuzzy systems conference (FUZZ-IEEE 2020), 19-24 July 2020, Glasgow, UK [virtual conference]. Piscataway: IEEE [online], article ID 9185641. Available from: https://doi.org/10.1109/CEC48606.2020.9185641

Text classification is one of the most important tasks in the field of Natural Language Processing. There are many approaches that focus on two main aspects: generating an effective representation; and selecting and refining algorithms to build the c...

Evolved ensemble of detectors for gross error detection. (2020)
Conference Proceeding
NGUYEN, T.T., MCCALL, J., WILSON, A., OCHEI, L., CORBETT, H. and STOCKTON, P. 2020. Evolved ensemble of detectors for gross error detection. In GECCO '20: proceedings of the Genetic and evolutionary computation conference companion (GECCO 2020), 8-12 July 2020, Cancún, Mexico. New York: ACM [online], pages 281-282. Available from: https://doi.org/10.1145/3377929.3389906

In this study, we evolve an ensemble of detectors to check for the presence of gross systematic errors in measurement data. We use the Fisher method to combine the output of different detectors and then test the hypothesis about the presence of gross err...
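
For readers unfamiliar with Fisher's method mentioned above, the sketch below shows how p-values from several detectors can be combined into a single statistic, -2 Σ log p_i, which follows a chi-square distribution with 2k degrees of freedom under the null hypothesis of no gross error. The p-values are hypothetical and not taken from the paper.

```python
# A minimal sketch of Fisher's method for combining p-values from several gross
# error detectors; the p-values below are hypothetical, not from the paper.
import numpy as np
from scipy.stats import chi2

p_values = np.array([0.04, 0.20, 0.01])                 # one p-value per detector (illustrative)
statistic = -2.0 * np.sum(np.log(p_values))              # Fisher's combined test statistic
combined_p = chi2.sf(statistic, df=2 * len(p_values))    # chi-square with 2k degrees of freedom
print(f"combined p-value = {combined_p:.4f}")
if combined_p < 0.05:
    print("reject the null hypothesis: a gross error is likely present")
```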

Multi-layer heterogeneous ensemble with classifier and feature selection. (2020)
Conference Proceeding
NGUYEN, T.T., VAN PHAM, N., DANG, M.T., LUONG, A.V., MCCALL, J. and LIEW, A.W.C. 2020. Multi-layer heterogeneous ensemble with classifier and feature selection. In GECCO '20: proceedings of the Genetic and evolutionary computation conference (GECCO 2020), 8-12 July 2020, Cancun, Mexico. New York: ACM [online], pages 725-733. Available from: https://doi.org/10.1145/3377930.3389832

Deep Neural Networks have achieved many successes when applied to visual, text, and speech information in various domains. The crucial reasons behind these successes are the multi-layer architecture and the in-model feature transformation of deep le...

Evolving interval-based representation for multiple classifier fusion. (2020)
Journal Article
NGUYEN, T.T., DANG, M.T., BAGHEL, V.A., LUONG, A.V., MCCALL, J. and LIEW, A.W.-C. 2020. Evolving interval-based representation for multiple classifier fusion. Knowledge-based systems [online], 201-202, article ID 106034. Available from: https://doi.org/10.1016/j.knosys.2020.106034

Designing an ensemble of classifiers is one of the popular research topics in machine learning since it can give better results than using each constituent member. Furthermore, the performance of an ensemble can be improved using selection or adaptation...

Confidence in prediction: an approach for dynamic weighted ensemble. (2020)
Conference Proceeding
DO, D.T., NGUYEN, T.T., NGUYEN, T.T., LUONG, A.V., LIEW, A.W.-C. and MCCALL, J. 2020. Confidence in prediction: an approach for dynamic weighted ensemble. In Nguyen, N., Jearanaitanakij, K., Selamat, A., Trawiński, B. and Chittayasothorn, S. (eds.) Intelligent information and database systems: proceedings of the 12th Asian intelligent information and database systems conference (ACIIDS 2020), 23-26 March 2020, Phuket, Thailand. Lecture Notes in Computer Science, 12033. Cham: Springer [online], part 1, pages 358-370. Available from: https://doi.org/10.1007/978-3-030-41964-6_31

Combining classifiers in an ensemble is beneficial in achieving better prediction than using a single classifier. Furthermore, each classifier can be associated with a weight in the aggregation to boost the performance of the ensemble system. In this...
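
A minimal sketch of a confidence-weighted aggregation in this spirit (not necessarily the paper's exact scheme): each base classifier's vote on a test instance is weighted by the maximum class probability it assigns to that instance.

```python
# Illustrative dynamic weighted ensemble: each base classifier's vote on a test
# point is weighted by its own prediction confidence (its maximum class probability).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

classifiers = [LogisticRegression(max_iter=1000), GaussianNB(),
               DecisionTreeClassifier(random_state=0)]
for clf in classifiers:
    clf.fit(X_tr, y_tr)

all_probs = np.stack([clf.predict_proba(X_te) for clf in classifiers])   # (k, n, c)
confidence = all_probs.max(axis=2, keepdims=True)                         # per-instance weights
weighted = (confidence * all_probs).sum(axis=0) / confidence.sum(axis=0)  # dynamic weighted average
print("test accuracy:", (weighted.argmax(axis=1) == y_te).mean())
```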

Deep heterogeneous ensemble. (2019)
Journal Article
NGUYEN, T.T., DANG, M.T., PHAM, T.D., DAO, L.P., LUONG, A.V., MCCALL, J. and LIEW, A.W.C. 2019. Deep heterogeneous ensemble. Australian journal of intelligent information processing systems [online], 16(1): special issue on neural information processing: proceedings of the 26th International conference on neural information processing (ICONIP 2019), 12-15 December 2019, Sydney, Australia, pages 1-9. Available from: http://ajiips.com.au/papers/V16.1/v16n1_5-13.pdf

In recent years, deep neural networks (DNNs) have emerged as a powerful technique in many areas of machine learning. Although DNNs have achieved great breakthroughs in processing images, video, audio and text, they also have some limitations...

Evolving an optimal decision template for combining classifiers. (2019)
Conference Proceeding
NGUYEN, T.T., LUONG, A.V., DANG, M.T., DAO, L.P., NGUYEN, T.T.T., LIEW, A.W.-C. and MCCALL, J. 2019. Evolving an optimal decision template for combining classifiers. In Gedeon, T., Wong, K.W. and Lee, M. (eds.) Neural information processing: proceedings of the 26th International conference on neural information processing (ICONIP 2019), 12-15 December 2019, Sydney, Australia. Part I. Lecture notes in computer science, 11953. Cham: Springer [online], pages 608-620. Available from: https://doi.org/10.1007/978-3-030-36708-4_50

In this paper, we aim to develop an effective combining algorithm for ensemble learning systems. The Decision Template method, one of the most popular combining algorithms for ensemble systems, does not perform well when working on certain datasets l...
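
For context, the sketch below implements the classical Decision Template combiner that the paper builds on: one template per class is obtained by averaging the decision profiles of that class's training points, and a test point is assigned to the class whose template is closest. The evolved, optimised templates proposed in the paper are not reproduced here.

```python
# A minimal sketch of the classical Decision Template combiner; the evolved
# templates from the paper are not reproduced here.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

classifiers = [LogisticRegression(max_iter=1000), GaussianNB(),
               DecisionTreeClassifier(random_state=0)]
for clf in classifiers:
    clf.fit(X_tr, y_tr)

def decision_profile(X_):
    # Stack each classifier's class probabilities: (n_samples, k_classifiers, n_classes).
    return np.stack([clf.predict_proba(X_) for clf in classifiers], axis=1)

dp_tr = decision_profile(X_tr)
classes = np.unique(y_tr)
# One template per class: the mean decision profile of that class's training points.
templates = np.stack([dp_tr[y_tr == c].mean(axis=0) for c in classes])

dp_te = decision_profile(X_te)
# Assign each test point to the class whose template is closest (squared Euclidean distance).
dists = ((dp_te[:, None, :, :] - templates[None, :, :, :]) ** 2).sum(axis=(2, 3))
pred = classes[dists.argmin(axis=1)]
print("test accuracy:", (pred == y_te).mean())
```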

Ensemble selection based on classifier prediction confidence. (2019)
Journal Article
NGUYEN, T.T., LUONG, A.V., DANG, M.T., LIEW, A.W.-C. and MCCALL, J. 2020. Ensemble selection based on classifier prediction confidence. Pattern recognition [online], 100, article ID 107104. Available from: https://doi.org/10.1016/j.patcog.2019.107104

Ensemble selection is one of the most studied topics in ensemble learning because a selected subset of base classifiers may perform better than the whole ensemble system. In recent years, a great many ensemble selection methods have been introduced....
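
A simple sketch of ensemble selection by greedy forward search on a validation set; plain validation accuracy stands in here for the paper's confidence-based criterion, and the classifier pool is illustrative.

```python
# Illustrative ensemble selection: greedily add the base classifier that most
# improves validation accuracy of the averaged ensemble, stopping when no
# candidate improves it further.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=800, n_features=20, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

pool = [LogisticRegression(max_iter=1000), GaussianNB(),
        KNeighborsClassifier(), DecisionTreeClassifier(random_state=0)]
for clf in pool:
    clf.fit(X_tr, y_tr)
val_probs = [clf.predict_proba(X_val) for clf in pool]

def acc(indices):
    avg = np.mean([val_probs[i] for i in indices], axis=0)
    return (avg.argmax(axis=1) == y_val).mean()

selected, remaining, best = [], list(range(len(pool))), -1.0
while remaining:
    cand = max(remaining, key=lambda i: acc(selected + [i]))
    if acc(selected + [cand]) <= best:
        break  # stop when adding any classifier no longer improves validation accuracy
    best = acc(selected + [cand])
    selected.append(cand)
    remaining.remove(cand)

print("selected classifiers:", [type(pool[i]).__name__ for i in selected], "val acc:", best)
```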