
Research Repository


Outputs (24)

On modeling the dynamic thermal behavior of electrical machines using genetic programming and artificial neural networks. (2020)
Presentation / Conference Contribution
ZAVOIANU, A.-C., KITZBERGER, M., BRAMERDORFER, G. and SAMINGER-PLATZ, S. 2020. On modeling the dynamic thermal behavior of electrical machines using genetic programming and artificial neural networks. In Moreno-Díaz, R., Pichler, F. and Quesada-Arencibia, A. (eds.) Computer aided systems theory: EUROCAST 2019: revised selected papers from the proceedings of the 17th International conference on computer aided systems theory (EUROCAST 2019), 17-22 February 2019, Las Palmas de Gran Canaria, Spain. Lecture notes in computer science, 12013. Cham: Springer [online], part 1, pages 319-326. Available from: https://doi.org/10.1007/978-3-030-45093-9_39

We describe initial attempts to model the dynamic thermal behavior of electrical machines by evaluating the ability of linear and non-linear (regression) modeling techniques to replicate the performance of simulations carried out using a lumped param...
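The lumped-parameter idea the abstract refers to can be illustrated with a toy first-order thermal model (all parameter values below are made up for illustration and are not taken from the paper): a single thermal mass with thermal resistance R and capacitance C, heated by constant losses P, warms along an exponential curve toward a steady-state temperature.

```python
import numpy as np

# Toy single-node lumped-parameter thermal model (assumed values, not the
# paper's): T(t) = T_amb + R*P*(1 - exp(-t/(R*C))).
R, C, P, T_amb = 0.5, 200.0, 100.0, 20.0   # K/W, J/K, W, degC (assumed)
t = np.linspace(0.0, 600.0, 7)             # time samples in seconds
T = T_amb + R * P * (1.0 - np.exp(-t / (R * C)))
steady_state = T_amb + R * P               # asymptotic temperature
```

Regression techniques such as those evaluated in the paper would be fitted to temperature trajectories like `T` rather than to the closed-form model itself.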

Confidence in prediction: an approach for dynamic weighted ensemble. (2020)
Presentation / Conference Contribution
DO, D.T., NGUYEN, T.T., NGUYEN, T.T., LUONG, A.V., LIEW, A.W.-C. and MCCALL, J. 2020. Confidence in prediction: an approach for dynamic weighted ensemble. In Nguyen, N., Jearanaitanakij, K., Selamat, A., Trawiński, B. and Chittayasothorn, S. (eds.) Intelligent information and database systems: proceedings of the 12th Asian conference on intelligent information and database systems (ACIIDS 2020), 23-26 March 2020, Phuket, Thailand. Lecture notes in computer science, 12033. Cham: Springer [online], part 1, pages 358-370. Available from: https://doi.org/10.1007/978-3-030-41964-6_31

Combining classifiers in an ensemble is beneficial in achieving better prediction than using a single classifier. Furthermore, each classifier can be associated with a weight in the aggregation to boost the performance of the ensemble system. In this...
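The weighted aggregation the abstract describes can be sketched as follows. This is a generic weighted soft-voting illustration, not the confidence-based dynamic weighting proposed in the paper; the function name and toy probabilities are hypothetical.

```python
import numpy as np

def weighted_ensemble(probas, weights):
    """Combine classifier outputs by a weighted average of class probabilities.

    probas:  list of (n_samples, n_classes) arrays, one per classifier.
    weights: one non-negative weight per classifier (normalised here).
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    combined = sum(wi * p for wi, p in zip(w, probas))
    return combined.argmax(axis=1)  # predicted class per sample

# Two toy classifiers scoring three samples over two classes.
p1 = np.array([[0.9, 0.1], [0.4, 0.6], [0.2, 0.8]])
p2 = np.array([[0.6, 0.4], [0.8, 0.2], [0.1, 0.9]])
preds = weighted_ensemble([p1, p2], weights=[2.0, 1.0])
```

Giving the first classifier twice the weight lets it override the second on the middle sample, which is the kind of effect a learned or confidence-driven weighting scheme exploits.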

Deep heterogeneous ensemble. (2019)
Presentation / Conference Contribution
NGUYEN, T.T., DANG, M.T., PHAM, T.D., DAO, L.P., LUONG, A.V., MCCALL, J. and LIEW, A.W.C. 2019. Deep heterogeneous ensemble. Australian journal of intelligent information processing systems [online], 16(1): special issue on neural information processing: proceedings of the 26th International conference on neural information processing (ICONIP 2019), 12-15 December 2019, Sydney, Australia, pages 1-9. Available from: http://ajiips.com.au/papers/V16.1/v16n1_5-13.pdf

In recent years, deep neural networks (DNNs) have emerged as a powerful technique in many areas of machine learning. Although DNNs have achieved great breakthroughs in processing images, video, audio and text, they also have some limitations...
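A heterogeneous ensemble, in the general sense the title refers to, trains learners of different types on the same data and combines their outputs. The sketch below is a rough single-layer illustration using averaged class probabilities, not the deep, multi-layer architecture proposed in the paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Three structurally different base learners on the same synthetic task.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)
learners = [LogisticRegression(max_iter=1000),
            GaussianNB(),
            DecisionTreeClassifier(max_depth=3, random_state=0)]

# Average the class-probability outputs and take the most likely class.
probas = [clf.fit(X, y).predict_proba(X) for clf in learners]
pred = np.mean(probas, axis=0).argmax(axis=1)
accuracy = (pred == y).mean()
```

A deep variant would feed the concatenated base-learner outputs into a further layer of learners instead of averaging them directly.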

Evolving an optimal decision template for combining classifiers. (2019)
Presentation / Conference Contribution
NGUYEN, T.T., LUONG, A.V., DANG, M.T., DAO, L.P., NGUYEN, T.T.T., LIEW, A.W.-C. and MCCALL, J. 2019. Evolving an optimal decision template for combining classifiers. In Gedeon, T., Wong, K.W. and Lee, M. (eds.) Neural information processing: proceedings of the 26th International conference on neural information processing (ICONIP 2019), 12-15 December 2019, Sydney, Australia. Part I. Lecture notes in computer science, 11953. Cham: Springer [online], pages 608-620. Available from: https://doi.org/10.1007/978-3-030-36708-4_50

In this paper, we aim to develop an effective combining algorithm for ensemble learning systems. The Decision Template method, one of the most popular combining algorithms for ensemble systems, does not perform well when working on certain datasets l...
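The classic Decision Template combiner that the abstract takes as its starting point can be sketched as follows; this is the standard mean-template formulation (due to Kuncheva), not the evolved variant the paper proposes, and the toy data is invented.

```python
import numpy as np

# A decision profile is the (n_classifiers, n_classes) matrix of soft outputs
# for one sample; a class's template is the mean profile over its training
# samples. New samples get the class of the nearest template.
def fit_templates(profiles, labels, n_classes):
    return np.stack([profiles[labels == c].mean(axis=0)
                     for c in range(n_classes)])

def predict(profiles, templates):
    dists = np.array([[np.linalg.norm(p - t) for t in templates]
                      for p in profiles])
    return dists.argmin(axis=1)

# Toy data: 4 training samples, 2 classifiers, 2 classes.
train = np.array([
    [[0.9, 0.1], [0.8, 0.2]],  # class 0
    [[0.7, 0.3], [0.9, 0.1]],  # class 0
    [[0.2, 0.8], [0.1, 0.9]],  # class 1
    [[0.3, 0.7], [0.2, 0.8]],  # class 1
])
labels = np.array([0, 0, 1, 1])
templates = fit_templates(train, labels, n_classes=2)
pred = predict(np.array([[[0.85, 0.15], [0.8, 0.2]]]), templates)
```

The paper's contribution is, per the abstract, to search for better templates than these simple per-class means.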