Professor Nirmalie Wiratunga n.wiratunga@rgu.ac.uk
Associate Dean for Research
Anjana Wijekoon
Dr Ikechukwu Nkisi-Orji i.nkisi-orji@rgu.ac.uk
Chancellor's Fellow
Dr Kyle Martin k.martin3@rgu.ac.uk
Lecturer
Mr Chamath Palihawadana c.palihawadana@rgu.ac.uk
Research Assistant
Dr David Corsar d.corsar1@rgu.ac.uk
Senior Lecturer
Hayley Borck
Editor
Viktor Eisenstadt
Editor
Antonio Sánchez-Ruiz
Editor
Michael Floyd
Editor
Counterfactual explanations focus on 'actionable knowledge' to help end-users understand how a machine learning model's outcome could be changed to a more desirable one. For this purpose, a counterfactual explainer needs to reason with similarity knowledge in order to discover the input dependencies that relate to outcome changes. Identifying the minimum subset of feature changes needed to action a change in the decision is an interesting challenge for counterfactual explainers. In this paper we show how feature relevance-based explainers (e.g. LIME, SHAP) can inform a counterfactual explainer to identify the minimum subset of 'actionable features'. We demonstrate our DisCERN (Discovering Counterfactual Explanations using Relevance Features from Neighbourhoods) algorithm on three datasets and compare it against the widely used counterfactual approach DiCE. Our preliminary results show DisCERN to be a viable strategy that should be adopted to minimise the number of actionable changes.
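The abstract describes DisCERN only at a high level. The following is a minimal, illustrative Python sketch of that idea, assuming a scikit-learn-style classifier and a pre-computed vector of per-feature relevance weights (e.g. absolute SHAP or LIME weights). The helper names (`nearest_unlike_neighbour`, `discern_counterfactual`) and the exact substitution order are assumptions made for illustration, not the authors' reference implementation.

```python
# Illustrative sketch of the DisCERN idea from the abstract (not the
# authors' code): retrieve the nearest neighbour that already has the
# desired outcome, then copy feature values from it to the query in
# order of feature relevance until the model's prediction flips.
import numpy as np

def nearest_unlike_neighbour(x, X_train, y_train, desired_class):
    """Return the closest training instance labelled with the desired class."""
    candidates = X_train[y_train == desired_class]
    distances = np.linalg.norm(candidates - x, axis=1)
    return candidates[np.argmin(distances)]

def discern_counterfactual(model, x, X_train, y_train, desired_class, relevance):
    """Adapt x towards its nearest unlike neighbour, most relevant features first.

    `relevance` is assumed to be a 1-D array of per-feature weights obtained
    from a feature relevance explainer such as LIME or SHAP.
    """
    nun = nearest_unlike_neighbour(x, X_train, y_train, desired_class)
    counterfactual = x.copy()
    # Try the most relevant features first so that as few features as
    # possible need to change ('actionable features').
    for feature in np.argsort(-np.abs(relevance)):
        counterfactual[feature] = nun[feature]
        if model.predict(counterfactual.reshape(1, -1))[0] == desired_class:
            return counterfactual  # minimal actionable change found
    return counterfactual  # worst case: falls back to the NUN itself
```

In use, `relevance` could for instance be the SHAP values computed for the query instance; ordering the substitutions by absolute relevance is what keeps the set of changed features small, which is the property compared against DiCE in the paper.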
WIRATUNGA, N., WIJEKOON, A., NKISI-ORJI, I., MARTIN, K., PALIHAWADANA, C. and CORSAR, D. 2021. Actionable feature discovery in counterfactuals using feature relevance explainers. In Borck, H., Eisenstadt, V., Sánchez-Ruiz, A. and Floyd, M. (eds.) Workshop proceedings of the 29th International conference on case-based reasoning (ICCBR-WS 2021), 13-16 September 2021, [virtual event]. CEUR workshop proceedings, 3017. Aachen: CEUR-WS [online], pages 63-74. Available from: http://ceur-ws.org/Vol-3017/101.pdf
Field | Value |
---|---|
Presentation Conference Type | Conference Paper (published) |
Conference Name | Workshops of the 29th International conference on case-based reasoning (ICCBR-WS 2021) |
Start Date | Sep 13, 2021 |
End Date | Sep 16, 2021 |
Acceptance Date | Jun 11, 2021 |
Online Publication Date | Sep 16, 2021 |
Publication Date | Nov 24, 2021 |
Deposit Date | Jan 6, 2022 |
Publicly Available Date | Jan 6, 2022 |
Publisher | CEUR-WS |
Peer Reviewed | Peer Reviewed |
Pages | 63-74 |
Series Title | CEUR workshop proceedings |
Series Number | 3017 |
Series ISSN | 1613-0073 |
Keywords | Explainable AI; Counterfactual; Feature relevance; Actionable features |
Public URL | https://rgu-repository.worktribe.com/output/1563535 |
Publisher URL | http://ceur-ws.org/Vol-3017/101.pdf |
WIRATUNGA 2021 Actionable feature discovery (VOR v2) (PDF, 532 KB)
Publisher Licence URL: https://creativecommons.org/licenses/by/4.0/