Professor Nirmalie Wiratunga n.wiratunga@rgu.ac.uk
Associate Dean for Research
Dr Kyle Martin k.martin3@rgu.ac.uk
Lecturer
Dr Anjana Wijekoon
Dr David Corsar d.corsar1@rgu.ac.uk
Senior Lecturer
iSee: advancing multi-shot explainable AI using case-based recommendations. (2024)
Presentation / Conference Contribution
WIJEKOON, A., WIRATUNGA, N., CORSAR, D., MARTIN, K., NKISI-ORJI, I., PALIHAWADANA, C., CARO-MARTÍNEZ, M., DÍAZ-AGUDO, B., BRIDGE, D. and LIRET, A. 2024. iSee: advancing multi-shot explainable AI using case-based recommendations. In Endriss, U., Melo, F.S., Bach, K., et al. (eds.) ECAI 2024: proceedings of the 27th European conference on artificial intelligence, co-located with the 13th conference on Prestigious applications of intelligent systems (PAIS 2024), 19-24 October 2024, Santiago de Compostela, Spain. Frontiers in artificial intelligence and applications, 392. Amsterdam: IOS Press [online], pages 4626-4633. Available from: https://doi.org/10.3233/FAIA241057
Explainable AI (XAI) can greatly enhance user trust and satisfaction in AI-assisted decision-making processes. Recent findings suggest that a single explainer may not meet the diverse needs of multiple users in an AI system; indeed, even individual users…
Building personalised XAI experiences through iSee: a case-based reasoning-driven platform. (2024)
Presentation / Conference Contribution
CARO-MARTÍNEZ, M., LIRET, A., DÍAZ-AGUDO, B., RECIO-GARCÍA, J.A., DARIAS, J., WIRATUNGA, N., WIJEKOON, A., MARTIN, K., NKISI-ORJI, I., CORSAR, D., PALIHAWADANA, C., PIRIE, C., BRIDGE, D., PRADEEP, P. and FLEISCH, B. 2024. Building personalised XAI experiences through iSee: a case-based reasoning-driven platform. In Longo, L., Liu, W. and Montavon, G. (eds.) xAI-2024: LB/D/DC: joint proceedings of the xAI 2024 late-breaking work, demos and doctoral consortium, co-located with the 2nd World conference on eXplainable artificial intelligence (xAI 2024), 17-19 July 2024, Valletta, Malta. CEUR workshop proceedings, 3793. Aachen: CEUR-WS [online], pages 313-320. Available from: https://ceur-ws.org/Vol-3793/paper_40.pdf
Nowadays, eXplainable Artificial Intelligence (XAI) is well-known as an important field in Computer Science due to the necessity of understanding the increasing complexity of Artificial Intelligence (AI) systems or algorithms. This is the reason why…
iSee: a case-based reasoning platform for the design of explanation experiences. (2024)
Journal Article
CARO-MARTÍNEZ, M., RECIO-GARCÍA, J.A., DÍAZ-AGUDO, B., DARIAS, J.M., WIRATUNGA, N., MARTIN, K., WIJEKOON, A., NKISI-ORJI, I., CORSAR, D., PRADEEP, P., BRIDGE, D. and LIRET, A. 2024. iSee: a case-based reasoning platform for the design of explanation experiences. Knowledge-based systems [online], 302, article number 112305. Available from: https://doi.org/10.1016/j.knosys.2024.112305
Explainable Artificial Intelligence (XAI) is an emerging field within Artificial Intelligence (AI) that has provided many methods that enable humans to understand and interpret the outcomes of AI systems. However, deciding on the best explanation approach…
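The recurring idea across these iSee outputs is that choosing an explanation approach can itself be cast as case-based reasoning: past "explanation experiences" are stored as cases, and the best-matching one is retrieved and reused for a new user query. The minimal Python sketch below illustrates that retrieve-and-reuse step; the case attributes, similarity weights and strategy names are illustrative assumptions, not the iSee platform's actual case representation or API.

# Illustrative sketch (not the iSee implementation): recommend an
# explanation strategy by retrieving the most similar past
# "explanation experience" case and reusing its strategy.
from dataclasses import dataclass

@dataclass
class ExperienceCase:
    user_role: str   # e.g. "clinician", "auditor", "end-user" (hypothetical)
    ai_task: str     # e.g. "classification", "regression"
    data_type: str   # e.g. "tabular", "image", "text"
    strategy: str    # the explanation strategy that worked before

CASE_BASE = [
    ExperienceCase("clinician", "classification", "tabular", "counterfactual"),
    ExperienceCase("auditor", "classification", "tabular", "feature-attribution"),
    ExperienceCase("end-user", "classification", "image", "saliency-map"),
]

def similarity(query: dict, case: ExperienceCase) -> float:
    """Weighted exact-match similarity over symbolic case attributes."""
    weights = {"user_role": 0.5, "ai_task": 0.3, "data_type": 0.2}
    return sum(w for attr, w in weights.items()
               if query[attr] == getattr(case, attr))

def recommend(query: dict) -> str:
    """Retrieve the best-matching case and reuse its strategy."""
    best = max(CASE_BASE, key=lambda c: similarity(query, c))
    return best.strategy

print(recommend({"user_role": "clinician",
                 "ai_task": "classification",
                 "data_type": "tabular"}))   # -> "counterfactual"

A real platform would use a richer case structure and more refined similarity measures; the sketch only shows the retrieval mechanic that the abstract alludes to.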
CBR driven interactive explainable AI. (2023)
Presentation / Conference Contribution
WIJEKOON, A., WIRATUNGA, N., MARTIN, K., CORSAR, D., NKISI-ORJI, I., PALIHAWADANA, C., BRIDGE, D., PRADEEP, P., DÍAZ-AGUDO, B. and CARO-MARTÍNEZ, M. 2023. CBR driven interactive explainable AI. In Massie, S. and Chakraborti, S. (eds.) Case-based reasoning research and development: proceedings of the 31st International conference on case-based reasoning (ICCBR 2023), 17-20 July 2023, Aberdeen, UK. Lecture notes in computer science (LNCS), 14141. Cham: Springer [online], pages 169-184. Available from: https://doi.org/10.1007/978-3-031-40177-0_11
Explainable AI (XAI) can greatly enhance user trust and satisfaction in AI-assisted decision-making processes. Numerous explanation techniques (explainers) exist in the literature, and recent findings suggest that addressing multiple user needs requires…
iSee: intelligent sharing of explanation experiences. (2023)
Presentation / Conference Contribution
MARTIN, K., WIJEKOON, A., WIRATUNGA, N., PALIHAWADANA, C., NKISI-ORJI, I., CORSAR, D., DÍAZ-AGUDO, B., RECIO-GARCÍA, J.A., CARO-MARTÍNEZ, M., BRIDGE, D., PRADEEP, P., LIRET, A. and FLEISCH, B. 2022. iSee: intelligent sharing of explanation experiences. In Reuss, P. and Schönborn, J. (eds.) Workshop proceedings of the 30th International conference on case-based reasoning (ICCBR-WS 2022), 12-15 September 2022, Nancy, France. CEUR workshop proceedings, 3389. Aachen: CEUR-WS [online], pages 231-232. Available from: https://ceur-ws.org/Vol-3389/ICCBR_2022_Workshop_paper_83.pdf
The right to an explanation of the decision reached by a machine learning (ML) model is now an EU regulation. However, different system stakeholders may have different background knowledge, competencies and goals, thus requiring different kinds of explanations…
iSee: intelligent sharing of explanation experience of users for users. (2023)
Presentation / Conference Contribution
WIJEKOON, A., WIRATUNGA, N., PALIHAWADANA, C., NKISI-ORJI, I., CORSAR, D. and MARTIN, K. 2023. iSee: intelligent sharing of explanation experience of users for users. In IUI '23 companion: companion proceedings of the 28th Intelligent user interfaces international conference 2023 (IUI 2023), 27-31 March 2023, Sydney, Australia. New York: ACM [online], pages 79-82. Available from: https://doi.org/10.1145/3581754.3584137
The right to obtain an explanation of the decision reached by an Artificial Intelligence (AI) model is now an EU regulation. Different stakeholders of an AI system (e.g. managers, developers, auditors, etc.) may have different background knowledge, competencies…
iSee: demonstration video. [video recording] (2023)
Digital Artefact
WIJEKOON, A., WIRATUNGA, N., PALIHAWADANA, C., NKISI-ORJI, I., CORSAR, D. and MARTIN, K. 2023. iSee: demonstration video. [video recording]. New York: ACM [online]. Available from: https://dl.acm.org/doi/10.1145/3581754.3584137#sec-supp
This output presents a demonstration of the iSee platform. iSee is an ongoing project aimed at improving the user experience of AI by harnessing experiences and best practices in Explainable AI. To this end, iSee brings together research and development…
How close is too close? Role of feature attributions in discovering counterfactual explanations. (2022)
Presentation / Conference Contribution
WIJEKOON, A., WIRATUNGA, N., NKISI-ORJI, I., PALIHAWADANA, C., CORSAR, D. and MARTIN, K. 2022. How close is too close? Role of feature attributions in discovering counterfactual explanations. In Keane, M.T. and Wiratunga, N. (eds.) Case-based reasoning research and development: proceedings of the 30th International conference on case-based reasoning (ICCBR 2022), 12-15 September 2022, Nancy, France. Lecture notes in computer science, 13405. Cham: Springer [online], pages 33-47. Available from: https://doi.org/10.1007/978-3-031-14923-8_3
Counterfactual explanations describe how an outcome can be changed to a more desirable one. In XAI, counterfactuals are "actionable" explanations that help users to understand how model decisions can be changed by adapting features of an input…
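The question in the title comes down to how a candidate counterfactual is scored against the query, typically balancing proximity (how little the instance moves) against sparsity (how few features change). The short sketch below shows those two measures, with feature attributions used as per-feature weights so that movement along highly relevant features dominates the measure; the example data and the specific weighting scheme are illustrative assumptions, not values or formulas from the paper.

# Hedged sketch of the proximity/sparsity trade-off for evaluating a
# candidate counterfactual; weights and data are illustrative only.
import numpy as np

def weighted_proximity(x, cf, attributions):
    """Attribution-weighted L1 distance between query x and counterfactual cf."""
    w = np.abs(attributions) / np.abs(attributions).sum()
    return float(np.sum(w * np.abs(x - cf)))

def sparsity(x, cf):
    """Number of features the counterfactual changes."""
    return int(np.sum(x != cf))

x = np.array([0.2, 5.0, 1.0])     # query instance
cf = np.array([0.2, 3.5, 0.0])    # candidate counterfactual
attr = np.array([0.1, 0.7, 0.2])  # SHAP/LIME-style relevance scores (assumed)

print(weighted_proximity(x, cf, attr), sparsity(x, cf))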
DisCERN: discovering counterfactual explanations using relevance features from neighbourhoods. (2021)
Presentation / Conference Contribution
WIRATUNGA, N., WIJEKOON, A., NKISI-ORJI, I., MARTIN, K., PALIHAWADANA, C. and CORSAR, D. 2021. DisCERN: discovering counterfactual explanations using relevance features from neighbourhoods. In Proceedings of the 33rd IEEE (Institute of Electrical and Electronics Engineers) International conference on tools with artificial intelligence 2021 (ICTAI 2021), 1-3 November 2021, Washington, USA [virtual conference]. Piscataway: IEEE [online], pages 1466-1473. Available from: https://doi.org/10.1109/ICTAI52525.2021.00233
Counterfactual explanations focus on 'actionable knowledge' to help end-users understand how a machine learning outcome could be changed to a more desirable outcome. For this purpose a counterfactual explainer needs to discover input dependencies that…
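As the abstract outlines, the method discovers counterfactuals from neighbourhoods with the help of feature relevance. A minimal Python sketch of that idea follows, under the assumption that one starts from the nearest unlike neighbour (NUN) and copies its feature values into the query in decreasing order of relevance until the model's prediction flips; it illustrates the technique on synthetic data and is not the authors' DisCERN implementation.

# Sketch of NUN-based counterfactual discovery guided by feature relevance
# (an illustration of the technique, not the published DisCERN code).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def discern_style_counterfactual(model, X, y, query, relevance):
    """relevance: per-feature importance scores (higher = more relevant)."""
    pred = model.predict(query.reshape(1, -1))[0]
    unlike = X[y != pred]                        # cases with a different label
    nun = unlike[np.argmin(np.linalg.norm(unlike - query, axis=1))]
    cf = query.copy()
    for f in np.argsort(relevance)[::-1]:        # most relevant feature first
        cf[f] = nun[f]                           # adopt the NUN's value
        if model.predict(cf.reshape(1, -1))[0] != pred:
            return cf                            # prediction flipped: done
    return nun                                   # fallback: the NUN itself

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
model = RandomForestClassifier(random_state=0).fit(X, y)

cf = discern_style_counterfactual(model, X, y, X[0], model.feature_importances_)
print(X[0], "->", cf)

Copying the most relevant features first tends to flip the prediction after only a few substitutions, keeping the counterfactual sparse while its values stay plausible, since they come from a real neighbouring case.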
Actionable feature discovery in counterfactuals using feature relevance explainers. (2021)
Presentation / Conference Contribution
WIRATUNGA, N., WIJEKOON, A., NKISI-ORJI, I., MARTIN, K., PALIHAWADANA, C. and CORSAR, D. 2021. Actionable feature discovery in counterfactuals using feature relevance explainers. In Borck, H., Eisenstadt, V., Sánchez-Ruiz, A. and Floyd, M. (eds.) Workshop proceedings of the 29th International conference on case-based reasoning (ICCBR-WS 2021), 13-16 September 2021, [virtual event]. CEUR workshop proceedings, 3017. Aachen: CEUR-WS [online], pages 63-74. Available from: http://ceur-ws.org/Vol-3017/101.pdf
Counterfactual explanations focus on 'actionable knowledge' to help end-users understand how a Machine Learning model outcome could be changed to a more desirable outcome. For this purpose a counterfactual explainer needs to be able to reason with…