Research Repository

All Outputs (13)

Which classifiers are connected to others? An optimal connection framework for multi-layer ensemble systems. (2024)
Journal Article
DANG, T., NGUYEN, T.T., LIEW, A.W.-C., ELYAN, E. and MCCALL, J. 2024. Which classifiers are connected to others? An optimal connection framework for multi-layer ensemble systems. Knowledge-based systems [online], 304, article number 112522. Available from: https://doi.org/10.1016/j.knosys.2024.112522

Ensemble learning is a powerful machine learning strategy that combines multiple models, e.g. classifiers, to improve predictions beyond what any single model can achieve. Until recently, traditional ensemble methods typically used only one layer of mod...
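
For intuition, the layered idea behind such systems can be mocked up with an ordinary stacked ensemble: heterogeneous base classifiers form the first layer and a combiner learns from their outputs. The sketch below is a generic scikit-learn stacking example with assumed dataset and model choices, not the optimal-connection framework proposed in the paper.

```python
# Minimal two-layer (stacked) ensemble sketch -- illustrative only, not the
# optimal-connection framework described in the paper.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Layer 1: heterogeneous base classifiers.
base_layer = [
    ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
    ("nb", GaussianNB()),
    ("knn", KNeighborsClassifier()),
]

# Layer 2: a combiner trained on the base classifiers' cross-validated probabilities.
ensemble = StackingClassifier(estimators=base_layer,
                              final_estimator=LogisticRegression(max_iter=1000),
                              stack_method="predict_proba", cv=5)
ensemble.fit(X_train, y_train)
print("two-layer ensemble accuracy:", ensemble.score(X_test, y_test))
```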

Two-layer ensemble of deep learning models for medical image segmentation. (2024)
Journal Article
DANG, T., NGUYEN, T.T., MCCALL, J., ELYAN, E. and MORENO-GARCÍA, C.F. 2024. Two-layer ensemble of deep learning models for medical image segmentation. Cognitive computation [online], 16(3), pages 1141-1160. Available from: https://doi.org/10.1007/s12559-024-10257-5

One of the most important areas in medical image analysis is segmentation, in which raw image data is partitioned into structured and meaningful regions to gain further insights. By using Deep Neural Networks (DNN), AI-based automated segmentation al...
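
A rough sense of how an ensemble of segmentation networks can be fused is given by simple per-pixel averaging of the members' class-probability maps. The sketch below uses random arrays in place of real DNN outputs and is only a hedged illustration, not the two-layer scheme of the paper.

```python
# Hedged sketch: fuse per-pixel class probabilities from several segmentation
# models by simple averaging (the paper's two-layer combiner is more involved).
import numpy as np

def fuse_segmentations(prob_maps):
    """prob_maps: list of arrays of shape (H, W, n_classes), softmax outputs of
    individual segmentation models. Returns a fused label map of shape (H, W)."""
    stacked = np.stack(prob_maps, axis=0)      # (n_models, H, W, n_classes)
    mean_probs = stacked.mean(axis=0)          # average the ensemble members
    return mean_probs.argmax(axis=-1)          # per-pixel class decision

# Toy usage with random "model outputs" standing in for DNN predictions.
rng = np.random.default_rng(0)
maps = [rng.dirichlet(np.ones(3), size=(64, 64)) for _ in range(4)]
labels = fuse_segmentations(maps)
print(labels.shape, labels.min(), labels.max())
```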

DEFEG: deep ensemble with weighted feature generation. (2023)
Journal Article
LUONG, A.V., NGUYEN, T.T., HAN, K., VU, T.H., MCCALL, J. and LIEW, A.W.-C. 2023. DEFEG: deep ensemble with weighted feature generation. Knowledge-based systems [online], 275, article 110691. Available from: https://doi.org/10.1016/j.knosys.2023.110691

With the significant breakthrough of Deep Neural Networks in recent years, multi-layer architecture has influenced other sub-fields of machine learning, including ensemble learning. In 2017, Zhou and Feng introduced a deep random forest called gcFores...
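
gcForest-style cascades pass each layer's class-probability vectors forward as extra features for the next layer. The following sketch shows that augmentation mechanism with scikit-learn random forests on an assumed toy dataset; DEFEG's weighted feature generation itself is not reproduced here.

```python
# Hedged sketch of the cascade idea behind gcForest-style deep ensembles:
# each layer's class probabilities are concatenated with the original features
# and fed to the next layer.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_predict, train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

augmented_train, augmented_test = X_train, X_test
for depth in range(2):                                   # two cascade layers for brevity
    forest = RandomForestClassifier(n_estimators=100, random_state=depth)
    # Out-of-fold probabilities avoid leaking training labels into the next layer.
    train_probs = cross_val_predict(forest, augmented_train, y_train,
                                    cv=5, method="predict_proba")
    forest.fit(augmented_train, y_train)
    print(f"layer {depth} accuracy: {forest.score(augmented_test, y_test):.3f}")
    test_probs = forest.predict_proba(augmented_test)
    # The next layer sees the original features plus this layer's class vectors.
    augmented_train = np.hstack([X_train, train_probs])
    augmented_test = np.hstack([X_test, test_probs])
```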

A comparative study of anomaly detection methods for gross error detection problems. (2023)
Journal Article
DOBOS, D., NGUYEN, T.T., DANG, T., WILSON, A., CORBETT, H., MCCALL, J. and STOCKTON, P. 2023. A comparative study of anomaly detection methods for gross error detection problems. Computers and chemical engineering [online], 175, article 108263. Available from: https://doi.org/10.1016/j.compchemeng.2023.108263

The chemical industry requires highly accurate and reliable measurements to ensure smooth operation and effective monitoring of processing facilities. However, measured data inevitably contains errors from various sources. Traditionally in flow syste...

Heterogeneous ensemble selection for evolving data streams. (2020)
Journal Article
LUONG, A.V., NGUYEN, T.T., LIEW, A.W.-C. and WANG, S. 2021. Heterogeneous ensemble selection for evolving data streams. Pattern recognition [online], 112, article ID 107743. Available from: https://doi.org/10.1016/j.patcog.2020.107743

Ensemble learning has been widely applied to both batch data classification and streaming data classification. For the latter setting, most existing ensemble systems are homogeneous, which means they are generated from only one type of learning model...
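
The streaming, heterogeneous setting can be illustrated with a small test-then-train loop in which several different incremental learners are reweighted chunk by chunk. This is a hedged sketch with assumed models and chunk size, not the selection method proposed in the paper.

```python
# Hedged sketch of a heterogeneous streaming ensemble: different model types are
# updated incrementally and weighted by their accuracy on the latest chunk.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import Perceptron, SGDClassifier
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
classes = np.unique(y)

# Three different model types form the heterogeneous pool.
models = [GaussianNB(), SGDClassifier(random_state=0), Perceptron(random_state=0)]
weights = np.ones(len(models))

chunk = 500
for start in range(0, len(X), chunk):
    Xc, yc = X[start:start + chunk], y[start:start + chunk]
    if start > 0:
        # Test-then-train: predict the incoming chunk before learning from it.
        votes = np.stack([m.predict(Xc) for m in models])        # (n_models, chunk)
        fused = np.array([np.bincount(col, weights=weights, minlength=2).argmax()
                          for col in votes.T])
        print(f"chunk at {start}: ensemble accuracy = {(fused == yc).mean():.3f}")
        # Re-weight each member by its accuracy on this chunk.
        weights = np.array([(v == yc).mean() for v in votes]) + 1e-3
    for m in models:
        m.partial_fit(Xc, yc, classes=classes)
```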

Evolving interval-based representation for multiple classifier fusion. (2020)
Journal Article
NGUYEN, T.T., DANG, M.T., BAGHEL, V.A., LUONG, A.V., MCCALL, J. and LIEW, A.W.-C. 2020. Evolving interval-based representation for multiple classifier fusion. Knowledge-based systems [online], 201-202, article ID 106034. Available from: https://doi.org/10.1016/j.knosys.2020.106034

Designing an ensemble of classifiers is one of the popular research topics in machine learning since it can give better results than using each constituent member. Furthermore, the performance of an ensemble can be improved using selection or adaptation...

Ensemble selection based on classifier prediction confidence. (2019)
Journal Article
NGUYEN, T.T., LUONG, A.V., DANG, M.T., LIEW, A.W.-C. and MCCALL, J. 2020. Ensemble selection based on classifier prediction confidence. Pattern recognition [online], 100, article ID 107104. Available from: https://doi.org/10.1016/j.patcog.2019.107104

Ensemble selection is one of the most studied topics in ensemble learning because a selected subset of base classifiers may perform better than the whole ensemble system. In recent years, a great many ensemble selection methods have been introduced...
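
A minimal form of confidence-driven selection is to keep only the pool members whose average prediction confidence on held-out data clears a threshold, then vote with the survivors. The sketch below does this with a bagged pool in scikit-learn; the threshold and models are assumptions, and the paper's selection criterion is more refined.

```python
# Hedged sketch of ensemble selection by prediction confidence.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_digits(return_X_y=True)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

pool = BaggingClassifier(DecisionTreeClassifier(), n_estimators=30,
                         random_state=0).fit(X_train, y_train)

def mean_confidence(clf, X):
    # Confidence = probability assigned to the predicted class, averaged over X.
    return clf.predict_proba(X).max(axis=1).mean()

confidences = np.array([mean_confidence(m, X_val) for m in pool.estimators_])
selected = [m for m, c in zip(pool.estimators_, confidences)
            if c >= np.median(confidences)]

# Soft-vote with the selected subset only.
probs = np.mean([m.predict_proba(X_val) for m in selected], axis=0)
print(f"kept {len(selected)}/{len(pool.estimators_)} classifiers,",
      f"validation accuracy = {(probs.argmax(axis=1) == y_val).mean():.3f}")
```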

Multi-label classification via incremental clustering on an evolving data stream. (2019)
Journal Article
NGUYEN, T.T., DANG, M.T., LUONG, A.V., LIEW, A.W.-C., LIANG, T. and MCCALL, J. 2019. Multi-label classification via incremental clustering on an evolving data stream. Pattern recognition [online], 95, pages 96-113. Available from: https://doi.org/10.1016/j.patcog.2019.06.001

With the advancement of storage and processing technology, an enormous amount of data is collected on a daily basis in many applications. Nowadays, advanced data analytics have been used to mine the collected data for useful information and make pred...

A weighted multiple classifier framework based on random projection. (2019)
Journal Article
NGUYEN, T.T., DANG, M.T., LIEW, A.W.-C. and BEZDEK, J.C. 2019. A weighted multiple classifier framework based on random projection. Information sciences [online], 490, pages 36-58. Available from: https://doi.org/10.1016/j.ins.2019.03.067

In this paper, we propose a weighted multiple classifier framework based on random projections. Similar to the mechanism of other homogeneous ensemble methods, the base classifiers in our approach are obtained by a learning algorithm on different tra...
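
The general mechanism, training each base classifier on a different randomly projected view of the data and weighting the members when voting, can be sketched as follows; the accuracy-based weights and dataset here are assumptions rather than the paper's scheme.

```python
# Hedged sketch: one base classifier per random projection, weighted soft voting.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.random_projection import GaussianRandomProjection

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X_train, y_train, random_state=0)

members = []
for seed in range(10):
    proj = GaussianRandomProjection(n_components=32, random_state=seed).fit(X_tr)
    clf = LogisticRegression(max_iter=2000).fit(proj.transform(X_tr), y_tr)
    weight = clf.score(proj.transform(X_val), y_val)   # accuracy-based member weight
    members.append((proj, clf, weight))

# Weighted soft voting over the projected views.
total = sum(w for _, _, w in members)
probs = sum(w * clf.predict_proba(proj.transform(X_test))
            for proj, clf, w in members) / total
print("weighted ensemble accuracy:", (probs.argmax(axis=1) == y_test).mean())
```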

A lossless online Bayesian classifier. (2019)
Journal Article
NGUYEN, T.T.T., NGUYEN, T.T., SHARMA, R. and LIEW, A.W.-C. 2019. A lossless online Bayesian classifier. Information sciences [online], 489, pages 1-17. Available from: https://doi.org/10.1016/j.ins.2019.03.031

We are living in a world progressively driven by data. Besides the issue that big data cannot be entirely stored in the main memory as required by traditional offline learning methods, the problem of learning data that can only be collected over time...
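
The "lossless online" idea, an incremental model whose predictions match its batch-trained equivalent, can be illustrated with scikit-learn's incremental naive Bayes, whose sufficient statistics are updated exactly chunk by chunk. This is only an analogy under assumed data and chunk sizes, not the classifier proposed in the paper.

```python
# Hedged sketch: online updates via partial_fit reproduce the batch-fit model.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
classes = np.unique(y)

batch = GaussianNB().fit(X, y)                        # offline reference model

online = GaussianNB()
for start in range(0, len(X), 200):                   # data arrives in chunks of 200
    online.partial_fit(X[start:start + 200], y[start:start + 200], classes=classes)

agreement = (batch.predict(X) == online.predict(X)).mean()
print(f"batch vs online prediction agreement: {agreement:.4f}")
```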

Multi-label classification via label correlation and first order feature dependance in a data stream. (2019)
Journal Article
NGUYEN, T.T., NGUYEN, T.T.T., LUONG, A.V., NGUYEN, Q.V.H., LIEW, A.W.-C. and STANTIC, B. 2019. Multi-label classification via label correlation and first order feature dependance in a data stream. Pattern recognition [online], 90, pages 35-51. Available from: https://doi.org/10.1016/j.patcog.2019.01.007

Many batch learning algorithms have been introduced for offline multi-label classification (MLC) over the years. However, the increasing data volume in many applications such as social networks, sensor networks, and traffic monitoring has posed many...

Combining heterogeneous classifiers via granular prototypes. (2018)
Journal Article
NGUYEN, T.T., NGUYEN, M.P., PHAM, X.C., LIEW, A.W.-C. and PEDRYCZ, W. 2018. Combining heterogeneous classifiers via granular prototypes. Applied soft computing [online], 73, pages 795-815. Available from: https://doi.org/10.1016/j.asoc.2018.09.021

In this study, a novel framework to combine multiple classifiers in an ensemble system is introduced. Here we exploit the concept of information granule to construct granular prototypes for each class on the outputs of an ensemble of base classifiers...

Aggregation of classifiers: a justifiable information granularity approach. (2018)
Journal Article
NGUYEN, T.T., PHAM, X.C., LIEW, A.W.-C. and PEDRYCZ, W. 2019. Aggregation of classifiers: a justifiable information granularity approach. IEEE transactions on cybernetics [online], 49(6), pages 2168-2177. Available from: https://doi.org/10.1109/TCYB.2018.2821679

In this paper, we introduced a new approach to combining multiple classifiers in a heterogeneous ensemble system. Instead of using numerical membership values when combining, we constructed interval membership values for each class prediction from th...
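
Interval-valued combining can be mocked up by taking, for each class, the minimum and maximum posterior probability across the base classifiers and ranking classes by interval midpoint. The sketch below does this on an assumed dataset; the paper's justifiable-granularity optimisation of the intervals is not reproduced.

```python
# Hedged sketch: per-class membership intervals from base-classifier probabilities,
# decided by a simple interval-midpoint rule.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

base = [RandomForestClassifier(random_state=0), GaussianNB(),
        make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))]
for clf in base:
    clf.fit(X_train, y_train)

probs = np.stack([clf.predict_proba(X_test) for clf in base])  # (n_clf, n_samples, n_classes)
lower, upper = probs.min(axis=0), probs.max(axis=0)            # membership interval per class
midpoints = (lower + upper) / 2                                # simple interval ranking rule
pred = midpoints.argmax(axis=1)
print("interval-fusion accuracy:", (pred == y_test).mean())
```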