ASHISH UPADHYAY a.upadhyay@rgu.ac.uk
Completed Research Student
Dr Thanh Nguyen t.nguyen11@rgu.ac.uk
Senior Research Fellow
Dr Stewart Massie s.massie@rgu.ac.uk
Associate Professor
Professor John McCall j.mccall@rgu.ac.uk
Professorial Lead
Text classification is one of the most important tasks in the field of Natural Language Processing. Existing approaches focus on two main aspects: generating an effective representation; and selecting and refining algorithms to build the classification model. Traditional machine learning methods represent documents in vector space using features such as term frequencies, which have limitations in handling the order and semantics of words. Meanwhile, although deep learning classifiers have achieved many successes, they require substantial resources in terms of labelled data and computational complexity. In this work, a weighted ensemble of classifiers (WEC) is introduced to address the text classification problem. Instead of using majority vote as the combining method, we propose to associate each classifier's prediction with a different weight when combining classifiers. The optimal weights are obtained by minimising a loss function on the training data with the Particle Swarm Optimisation algorithm. We conducted experiments on five popular datasets and report classification performance in terms of accuracy and macro F1 score. WEC was run with several different combinations of traditional machine learning and deep learning classifiers to show its flexibility and robustness. Experimental results confirm the advantage of WEC, especially on smaller datasets.
UPADHYAY, A., NGUYEN, T.T., MASSIE, S. and MCCALL, J. 2020. WEC: weighted ensemble of text classifiers. In Proceedings of 2020 Institute of Electrical and Electronics Engineers (IEEE) congress on evolutionary computation (IEEE CEC 2020), part of the 2020 (IEEE) World congress on computational intelligence (IEEE WCCI 2020) and co-located with the 2020 International joint conference on neural networks (IJCNN 2020) and the 2020 IEEE International fuzzy systems conference (FUZZ-IEEE 2020), 19-24 July 2020, Glasgow, UK [virtual conference]. Piscataway: IEEE [online], article ID 9185641. Available from: https://doi.org/10.1109/CEC48606.2020.9185641
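The abstract describes WEC as a weighted combination of classifier predictions whose weights are found by minimising a training loss with Particle Swarm Optimisation. The sketch below is a minimal, self-contained illustration of that idea, assuming scikit-learn-style base classifiers exposing predict_proba, a small hand-rolled PSO over weights in [0, 1], and training error (1 − accuracy) as the loss; the base models, toy corpus and PSO settings are illustrative assumptions, not the exact configuration reported in the paper.

```python
# Minimal sketch of a weighted ensemble of classifiers (WEC-style), under the
# assumptions stated above. Labels are assumed to be integers 0..K-1 so that
# the argmax over combined class probabilities maps directly to a label.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB
from sklearn.ensemble import RandomForestClassifier


def pso_minimise(loss, dim, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Very small particle swarm optimiser over the box [0, 1]^dim."""
    rng = np.random.default_rng(seed)
    pos = rng.random((n_particles, dim))          # particle positions (candidate weights)
    vel = np.zeros((n_particles, dim))            # particle velocities
    pbest = pos.copy()                            # personal best positions
    pbest_val = np.array([loss(p) for p in pos])  # personal best losses
    gbest = pbest[pbest_val.argmin()].copy()      # global best position
    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.0, 1.0)
        vals = np.array([loss(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest


class WeightedEnsemble:
    """Combine per-classifier probability outputs with PSO-tuned weights."""

    def __init__(self, classifiers):
        self.classifiers = classifiers
        self.weights = np.full(len(classifiers), 1.0 / len(classifiers))

    def fit(self, X, y):
        for clf in self.classifiers:
            clf.fit(X, y)
        probas = [clf.predict_proba(X) for clf in self.classifiers]

        def loss(weights):  # training loss = 1 - accuracy of the weighted vote
            combined = sum(w_i * p for w_i, p in zip(weights, probas))
            return 1.0 - np.mean(combined.argmax(axis=1) == y)

        raw = pso_minimise(loss, dim=len(self.classifiers))
        self.weights = raw / (raw.sum() + 1e-12)  # normalise weights to sum to 1
        return self

    def predict(self, X):
        combined = sum(w_i * clf.predict_proba(X)
                       for w_i, clf in zip(self.weights, self.classifiers))
        return combined.argmax(axis=1)


if __name__ == "__main__":
    # Tiny illustrative corpus; the paper's experiments use five public datasets.
    texts = ["great film with a strong cast", "terrible plot and bad acting",
             "wonderful and moving story", "boring, I walked out",
             "an enjoyable watch", "a dreadful waste of time"]
    labels = np.array([1, 0, 1, 0, 1, 0])
    X = TfidfVectorizer().fit_transform(texts)
    wec = WeightedEnsemble([LogisticRegression(max_iter=1000),
                            MultinomialNB(),
                            RandomForestClassifier(n_estimators=50, random_state=0)])
    wec.fit(X, labels)
    print("learned weights:", wec.weights)
    print("training predictions:", wec.predict(X))
```

In this sketch the learned weights generalise majority voting: equal weights recover an unweighted soft vote, while PSO can down-weight classifiers that are less reliable on the training data.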
Presentation Conference Type | Conference Paper (published) |
---|---|
Conference Name | 2020 Institute of Electrical and Electronics Engineers (IEEE) congress on evolutionary computation (IEEE CEC 2020), part of the 2020 (IEEE) World congress on computational intelligence (IEEE WCCI 2020) and co-located with the 2020 International joint conference on neural networks (IJCNN 2020) and the 2020 IEEE International fuzzy systems conference (FUZZ-IEEE 2020) |
Start Date | Jul 19, 2020 |
End Date | Jul 24, 2020 |
Acceptance Date | Mar 20, 2020 |
Online Publication Date | Jul 24, 2020 |
Publication Date | Sep 3, 2020 |
Deposit Date | May 14, 2020 |
Publicly Available Date | May 14, 2020 |
Publisher | Institute of Electrical and Electronics Engineers (IEEE) |
Peer Reviewed | Peer Reviewed |
DOI | https://doi.org/10.1109/CEC48606.2020.9185641 |
Keywords | Text classification; Ensemble method; Ensemble of classifiers; Multiple classifiers; Particle swarm optimisation |
Public URL | https://rgu-repository.worktribe.com/output/905942 |
UPADHYAY 2020 WEC (362 Kb) PDF
Case-based approach to automated natural language generation for obituaries.
(2020)
Presentation / Conference Contribution
GEMv2: multilingual NLG benchmarking in a single line of code.
(2022)
Presentation / Conference Contribution
A case-based approach to data-to-text generation.
(2021)
Presentation / Conference Contribution
A case-based approach for content planning in data-to-text generation.
(2022)
Presentation / Conference Contribution
Content type profiling of data-to-text generation datasets.
(2022)
Presentation / Conference Contribution