Dr Kyle Martin k.martin3@rgu.ac.uk
Lecturer
Dr Anjana Wijekoon a.wijekoon1@rgu.ac.uk
Research Fellow B
Professor Nirmalie Wiratunga n.wiratunga@rgu.ac.uk
Associate Dean for Research
Mr Chamath Palihawadana c.palihawadana@rgu.ac.uk
Research Assistant
Dr Ikechukwu Nkisi-Orji i.nkisi-orji@rgu.ac.uk
Chancellor's Fellow
Dr David Corsar d.corsar1@rgu.ac.uk
Senior Lecturer
Belén Díaz-Agudo
Juan A. Recio-García
Marta Caro-Martínez
Derek Bridge
Preeja Pradeep
Anne Liret
Bruno Fleisch
Pascal Reuss
Editor
Jakob Schönborn
Editor
The right to an explanation of the decision reached by a machine learning (ML) model is now an EU regulation. However, different system stakeholders may have different background knowledge, competencies and goals, thus requiring different kinds of explanations. There is a growing armoury of XAI methods for interpreting ML models and explaining their predictions, recommendations and diagnoses. We refer to these collectively as "explanation strategies". As these explanation strategies mature, practitioners gain experience in understanding which strategies to deploy in different circumstances. What is lacking, and what the iSee project will address, is the science and technology for capturing, sharing and re-using explanation strategies based on similar user experiences, along with a much-needed route to explainable AI (XAI) compliance. Our vision is to improve every user's experience of AI by harnessing experiences of best practice in XAI and providing an interactive environment where personalised explanation experiences are accessible to everyone. Video Link: https://youtu.be/81O6-q_yx0s
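Re-using explanation strategies based on similar user experiences is, at its core, a case-based reasoning retrieval-and-reuse step. The following is a purely illustrative sketch, not taken from the paper: it assumes a hypothetical catalogue of past explanation experiences described by stakeholder type, domain and explanation goal, and retrieves the closest match for a new user so that its strategy can be reused.

# Illustrative sketch only: toy case-based retrieval of explanation strategies
# from past "explanation experiences". All attribute names and example cases
# are hypothetical and are not taken from the iSee paper.
from dataclasses import dataclass

@dataclass
class ExplanationExperience:
    stakeholder: str   # e.g. "clinician", "end-user", "auditor"
    domain: str        # e.g. "healthcare", "telecoms"
    goal: str          # e.g. "debug model", "justify decision"
    strategy: str      # the explanation strategy that worked well

def similarity(query: dict, case: ExplanationExperience) -> float:
    # Simple overlap similarity over categorical attributes.
    attrs = ("stakeholder", "domain", "goal")
    matches = sum(query[a] == getattr(case, a) for a in attrs)
    return matches / len(attrs)

def retrieve(query: dict, case_base: list) -> ExplanationExperience:
    # 1-NN retrieval: return the most similar past experience.
    return max(case_base, key=lambda case: similarity(query, case))

case_base = [
    ExplanationExperience("clinician", "healthcare", "justify decision", "counterfactual explanation"),
    ExplanationExperience("engineer", "telecoms", "debug model", "feature-relevance explanation"),
]

query = {"stakeholder": "clinician", "domain": "healthcare", "goal": "debug model"}
best = retrieve(query, case_base)
print(f"Suggested strategy: {best.strategy}")  # reuse the retrieved strategy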
MARTIN, K., WIJEKOON, A., WIRATUNGA, N., PALIHAWADANA, C., NKISI-ORJI, I., CORSAR, D., DÍAZ-AGUDO, B., RECIO-GARCÍA, J.A., CARO-MARTÍNEZ, M., BRIDGE, D., PRADEEP, P., LIRET, A. and FLEISCH, B. 2022. iSee: intelligent sharing of explanation experiences. In Reuss, P. and Schönborn, J. (eds.) ICCBR-WS 2022: proceedings of the 30th International conference on Case-based reasoning workshops 2022 (ICCBR-WS 2022) co-located with the 30th International conference on Case-based reasoning 2022 (ICCBR 2022), 12-15 September 2022, Nancy, France. Aachen: CEUR workshop proceedings [online], 3389, pages 231-232. Available from: https://ceur-ws.org/Vol-3389/ICCBR_2022_Workshop_paper_83.pdf
Conference Name | 30th International conference on Case-based reasoning workshops 2022 (ICCBR-WS 2022) co-located with the 30th International conference on Case-based reasoning 2022 (ICCBR 2022)
Conference Location | Nancy, France
Start Date | Sep 12, 2022
End Date | Sep 15, 2022
Acceptance Date | Jul 22, 2022 |
Online Publication Date | May 11, 2023 |
Publication Date | May 11, 2023 |
Deposit Date | Jun 2, 2023 |
Publicly Available Date | Jun 2, 2023 |
Journal | CEUR Workshop Proceedings |
Print ISSN | 1613-0073 |
Publisher | CEUR Workshop Proceedings |
Volume | 3389 |
Pages | 231-232 |
Series ISSN | 1613-0073 |
Book Title | ICCBR-WS 2022: proceedings of the 30th International conference on Case-based reasoning workshops 2022 (ICCBR-WS 2022) co-located with the 30th International conference on Case-based reasoning 2022 (ICCBR 2022) |
Keywords | Explainability; Case-based reasoning; Project showcase |
Public URL | https://rgu-repository.worktribe.com/output/1977718 |
Publisher URL | https://ceur-ws.org/Vol-3389/ |
File | MARTIN 2022 iSee intelligent sharing (VOR) (PDF, 977 Kb)
Publisher Licence URL | https://creativecommons.org/licenses/by/4.0/
Copyright Statement | © 2022 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
Clinical dialogue transcription error correction using Seq2Seq models.
(2022)
Conference Proceeding
Clinical dialogue transcription error correction using Seq2Seq models.
(2022)
Working Paper
DisCERN: discovering counterfactual explanations using relevance features from neighbourhoods.
(2021)
Conference Proceeding
Actionable feature discovery in counterfactuals using feature relevance explainers.
(2021)
Conference Proceeding
Counterfactual explanations for student outcome prediction with Moodle footprints.
(2021)
Conference Proceeding