Marta Caro-Martínez
Building personalised XAI experiences through iSee: a case-based reasoning-driven platform.
Caro-Martínez, Marta; Liret, Anne; Díaz-Agudo, Belén; Recio-García, Juan A.; Darias, Jesús; Wiratunga, Nirmalie; Wijekoon, Anjana; Martin, Kyle; Nkisi-Orji, Ikechukwu; Corsar, David; Palihawadana, Chamath; Pirie, Craig; Bridge, Derek; Pradeep, Preeja; Fleisch, Bruno
Authors
Anne Liret
Belén Díaz-Agudo
Juan A. Recio-García
Jesús Darias
Professor Nirmalie Wiratunga n.wiratunga@rgu.ac.uk
Associate Dean for Research
Anjana Wijekoon
Dr Kyle Martin k.martin3@rgu.ac.uk
Lecturer
Dr Ikechukwu Nkisi-Orji i.nkisi-orji@rgu.ac.uk
Chancellor's Fellow
Dr David Corsar d.corsar1@rgu.ac.uk
Senior Lecturer
Mr Chamath Palihawadana c.palihawadana@rgu.ac.uk
Research Assistant
Mr Craig Pirie c.pirie11@rgu.ac.uk
Research Assistant
Derek Bridge
Preeja Pradeep
Bruno Fleisch
Contributors
Luca Longo
Editor
Weiru Liu
Editor
Grégoire Montavon
Editor
Abstract
eXplainable Artificial Intelligence (XAI) is now well established as an important field in Computer Science, driven by the need to understand increasingly complex Artificial Intelligence (AI) systems and algorithms. As a result, the literature offers a wide variety of explanation techniques (explainers), many of them bundled into XAI libraries. The challenge for XAI designers is deciding which explainers are most suitable for each scenario, taking into account the AI model, the task to be explained, and the user's preferences, needs and knowledge; in short, fitting the explanation requirements. To address this problem, the iSee project was conceived to provide XAI designers with supporting tools for building their own explanation experiences. The result is iSee, a Case-Based Reasoning-driven platform that allows users to create personalised explanation experiences. With the iSee platform, users enter their explanation experience requirements and receive the XAI strategies most suitable for their situation, reusing strategies previously applied with success in similar contexts. The iSee platform comprises several tools and modules: the ontology, the cockpit, the explainer library, the Explanation Experiences Editor (iSeeE3), the chatbot, and the analytics dashboard. This paper introduces these tools as a demo and tutorial for current and future users and for the XAI community.
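The core Case-Based Reasoning step the abstract describes — matching a designer's explanation requirements against past explanation experiences and reusing the strategy that worked in the most similar context — can be sketched as follows. This is a minimal illustrative sketch only; the `Case` attributes, the similarity metric, and the strategy names are assumptions for illustration, not iSee's actual data model or API.

```python
from dataclasses import dataclass

@dataclass
class Case:
    """A past explanation experience: requirements plus the strategy that worked."""
    ai_task: str          # e.g. "classification"
    data_type: str        # e.g. "image" or "tabular"
    user_expertise: str   # e.g. "novice" or "expert"
    strategy: str         # the XAI strategy applied with success

def similarity(query: dict, case: Case) -> float:
    """Fraction of requirement attributes matching exactly (illustrative metric)."""
    attrs = ["ai_task", "data_type", "user_expertise"]
    return sum(query[a] == getattr(case, a) for a in attrs) / len(attrs)

def retrieve(query: dict, case_base: list[Case], k: int = 1) -> list[Case]:
    """The CBR 'retrieve' step: return the k most similar past experiences."""
    return sorted(case_base, key=lambda c: similarity(query, c), reverse=True)[:k]

# Hypothetical case base of previously successful explanation experiences.
case_base = [
    Case("classification", "image", "novice", "saliency-map explainer"),
    Case("classification", "tabular", "expert", "SHAP feature attribution"),
    Case("regression", "tabular", "novice", "counterfactual examples"),
]

# A designer's new explanation requirements.
query = {"ai_task": "classification", "data_type": "tabular", "user_expertise": "expert"}
best = retrieve(query, case_base)[0]
print(best.strategy)  # prints: SHAP feature attribution
```

In a full CBR cycle the retrieved strategy would then be revised against user feedback and retained as a new case; the sketch covers only retrieval.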
Citation
CARO-MARTÍNEZ, M., LIRET, A., DÍAZ-AGUDO, B., RECIO-GARCÍA, J.A., DARIAS, J., WIRATUNGA, N., WIJEKOON, A., MARTIN, K., NKISI-ORJI, I., CORSAR, D., PALIHAWADANA, C., PIRIE, C., BRIDGE, D., PRADEEP, P. and FLEISCH, B. 2024. Building personalised XAI experiences through iSee: a case-based reasoning-driven platform. In Longo, L., Liu, W. and Montavon, G. (eds.) xAI-2024: LB/D/DC: joint proceedings of the xAI 2024 late-breaking work, demos and doctoral consortium, co-located with the 2nd World conference on eXplainable artificial intelligence (xAI 2024), 17-19 July 2024, Valletta, Malta. Aachen: CEUR-WS [online], 3793, pages 313-320. Available from: https://ceur-ws.org/Vol-3793/paper_40.pdf
| Presentation Conference Type | Conference Paper (published) |
|---|---|
| Conference Name | 2024 xAI late-breaking work, demos and doctoral consortium (xAI-2024: LB/D/DC) |
| Start Date | Jul 17, 2024 |
| End Date | Jul 19, 2024 |
| Acceptance Date | Apr 5, 2024 |
| Online Publication Date | Sep 10, 2024 |
| Publication Date | Oct 19, 2024 |
| Deposit Date | Nov 15, 2024 |
| Publicly Available Date | Nov 15, 2024 |
| Publisher | CEUR-WS |
| Peer Reviewed | Peer Reviewed |
| Volume | 3793 |
| Series Title | CEUR workshop proceedings |
| Series ISSN | 1613-0073 |
| Keywords | Case-based reasoning; Personalised explanation experiences; Explainer library; Evaluation cockpit; Explanation experiences editor; XAI chatbot; XAI ontology |
| Public URL | https://rgu-repository.worktribe.com/output/2578334 |
| Publisher URL | https://ceur-ws.org/Vol-3793/ |
Files
CARO-MARTINEZ 2024 Building personalised XAI (VOR)
(1.2 MB)
PDF
Publisher Licence URL
https://creativecommons.org/licenses/by/4.0/
Copyright Statement
© 2024 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).