Conceptual modelling of explanation experiences through the iSeeOnto ontology.
Caro-Martínez, Marta; Wijekoon, Anjana; Recio-García, Juan A.; Corsar, David; Nkisi-Orji, Ikechukwu
Authors
Dr Anjana Wijekoon a.wijekoon1@rgu.ac.uk
Research Fellow B
Juan A. Recio-García
Dr David Corsar d.corsar1@rgu.ac.uk
Senior Lecturer
Dr Ikechukwu Nkisi-Orji i.nkisi-orji@rgu.ac.uk
Chancellor's Fellow
Contributors
Pascal Reuss
Editor
Jakob Schönborn
Editor
Abstract
Explainable Artificial Intelligence (XAI) is a large research field, needed in the many situations where we must understand the behaviour of Artificial Intelligence systems. However, each explanation need is unique, which makes it difficult to reuse existing explanation techniques and solutions when facing a new problem. Implementing an explanation system can therefore be very challenging, because the AI model, the user's needs and goals, the available data, suitable explainers, and other factors must all be taken into account. In this work, we propose a formal model to define and orchestrate all the elements involved in an explanation system, and make a novel contribution by formalising this model as the iSeeOnto ontology. This ontology not only enables the conceptualisation of a wide range of explanation systems, but also supports the application of Case-Based Reasoning as a knowledge transfer approach that reuses previous explanation experiences from unrelated domains. To demonstrate the suitability of the proposed model, we present an exhaustive validation by classifying reference explanation systems from the literature against the iSeeOnto ontology.
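The abstract describes recording explanation experiences as instances of the iSeeOnto ontology so that a Case-Based Reasoning system can later retrieve and reuse them. As a rough, non-authoritative sketch of that idea, the following Python snippet (using rdflib) records one hypothetical experience as RDF triples; the namespace, class, and property names (ExplanationExperience, hasUserGoal, usesExplainer, etc.) are invented for illustration and are not the actual iSeeOnto vocabulary — see the paper and the published ontology for the real terms.

```python
# Minimal sketch (not the authors' implementation): capturing one explanation
# experience as RDF individuals with rdflib. All class/property names below
# are illustrative placeholders, not the real iSeeOnto vocabulary.
from rdflib import Graph, Namespace, Literal, RDF, RDFS

ISEE = Namespace("http://example.org/iseeonto#")  # placeholder namespace

g = Graph()
g.bind("isee", ISEE)

# One explanation experience: the AI task, the user's goal, the explainer
# that was applied, and the observed outcome.
exp = ISEE["experience-001"]
g.add((exp, RDF.type, ISEE.ExplanationExperience))
g.add((exp, RDFS.label, Literal("Loan-approval classifier explained with counterfactuals")))
g.add((exp, ISEE.hasAITask, ISEE.Classification))
g.add((exp, ISEE.hasUserGoal, ISEE.Transparency))
g.add((exp, ISEE.usesExplainer, ISEE.CounterfactualExplainer))
g.add((exp, ISEE.hasOutcome, Literal("user accepted the explanation")))

# Serialising the case to a shared vocabulary is what would allow a CBR
# retrieval step to match a new explanation need against stored experiences.
print(g.serialize(format="turtle"))
```

Representing experiences against a common vocabulary in this way is what would make them comparable and reusable across otherwise unrelated domains, which is the knowledge-transfer role the paper assigns to Case-Based Reasoning.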
Citation
CARO-MARTÍNEZ, M., WIJEKOON, A., RECIO-GARCÍA, J.A., CORSAR, D. and NKISI-ORJI, I. 2022. Conceptual modelling of explanation experiences through the iSeeOnto ontology. In Reuss, P. and Schönborn, J. (eds.) ICCBR-WS 2022: proceedings of the 30th International conference on Case-based reasoning workshops 2022 (ICCBR-WS 2022) co-located with the 30th International conference on Case-based reasoning 2022 (ICCBR 2022), 12-15 September 2022, Nancy, France. Aachen: CEUR workshop proceedings [online], 3389, pages 117-128. Available from: https://ceur-ws.org/Vol-3389/ICCBR_2022_Workshop_paper_86.pdf
Conference Name | 30th International conference on Case-based reasoning workshops 2022 (ICCBR-WS 2022) co-located with the 30th International conference on Case-based reasoning 2022 (ICCBR 2022)
Conference Location | Nancy, France |
Start Date | Sep 12, 2022 |
End Date | Sep 15, 2022 |
Acceptance Date | Jul 22, 2022 |
Online Publication Date | May 11, 2023 |
Publication Date | May 11, 2023 |
Deposit Date | Jun 2, 2023 |
Publicly Available Date | Jun 2, 2023 |
Journal | CEUR Workshop Proceedings |
Print ISSN | 1613-0073 |
Publisher | CEUR Workshop Proceedings |
Volume | 3389 |
Pages | 117-128 |
Series ISSN | 1613-0073 |
Book Title | ICCBR-WS 2022: proceedings of the 30th International conference on Case-based reasoning workshops 2022 (ICCBR-WS 2022) co-located with the 30th International conference on Case-based reasoning 2022 (ICCBR 2022) |
Keywords | XAI; Ontology; Conceptual model; CBR |
Public URL | https://rgu-repository.worktribe.com/output/1977681 |
Publisher URL | https://ceur-ws.org/Vol-3389/ |
Files
CARO-MARTINEZ 2022 Conceptual modelling (VOR)
(3 MB)
PDF
Publisher Licence URL
https://creativecommons.org/licenses/by/4.0/
Copyright Statement
© 2022 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).