
Research Repository


All Outputs (11)

Clinical dialogue transcription error correction with self-supervision. (2023)
Conference Proceeding
NANAYAKKARA, G., WIRATUNGA, N., CORSAR, D., MARTIN, K. and WIJEKOON, A. 2023. Clinical dialogue transcription error correction with self-supervision. In Bramer, M. and Stahl, F. (eds.) Artificial intelligence XL: proceedings of the 43rd SGAI international conference on artificial intelligence (AI-2023), 12-14 December 2023, Cambridge, UK. Lecture notes in computer science, 14381. Cham: Springer [online], pages 33-46. Available from: https://doi.org/10.1007/978-3-031-47994-6_3

A clinical dialogue is a conversation between a clinician and a patient to share medical information, which is critical in clinical decision-making. The reliance on manual note-taking is highly inefficient and leads to transcription errors when digit...

Towards feasible counterfactual explanations: a taxonomy guided template-based NLG method. (2023)
Conference Proceeding
SALIMI, P., WIRATUNGA, N., CORSAR, D. and WIJEKOON, A. 2023. Towards feasible counterfactual explanations: a taxonomy guided template-based NLG method. In Gal, K., Nowé, A., Nalepa, G.J., Fairstein, R. and Rădulescu, R. (eds.) ECAI 2023: proceedings of the 26th European conference on artificial intelligence (ECAI 2023), 30 September - 4 October 2023, Kraków, Poland. Frontiers in artificial intelligence and applications, 372. Amsterdam: IOS Press [online], pages 2057-2064. Available from: https://doi.org/10.3233/FAIA230499

Counterfactual Explanations (cf-XAI) describe the smallest changes in feature values necessary to change an outcome from one class to another. However, many cf-XAI methods neglect the feasibility of those changes. In this paper, we introduce a novel...

CBR driven interactive explainable AI. (2023)
Conference Proceeding
WIJEKOON, A., WIRATUNGA, N., MARTIN, K., CORSAR, D., NKISI-ORJI, I., PALIHAWADANA, C., BRIDGE, D., PRADEEP, P., DÍAZ-AGUDO, B. and CARO-MARTÍNEZ, M. 2023. CBR driven interactive explainable AI. In Massie, S. and Chakraborti, S. (eds.) Case-based reasoning research and development: proceedings of the 31st International conference on case-based reasoning 2023 (ICCBR 2023), 17-20 July 2023, Aberdeen, UK. Lecture notes in computer science (LNCS), 14141. Cham: Springer [online], pages 169-184. Available from: https://doi.org/10.1007/978-3-031-40177-0_11

Explainable AI (XAI) can greatly enhance user trust and satisfaction in AI-assisted decision-making processes. Numerous explanation techniques (explainers) exist in the literature, and recent findings suggest that addressing multiple user needs requi...

Failure-driven transformational case reuse of explanation strategies in CloodCBR. (2023)
Conference Proceeding
NKISI-ORJI, I., PALIHAWADANA, C., WIRATUNGA, N., WIJEKOON, A. and CORSAR, D. 2023. Failure-driven transformational case reuse of explanation strategies in CloodCBR. In Massie, S. and Chakraborti, S. (eds.) Case-based reasoning research and development: proceedings of the 31st International conference on case-based reasoning 2023 (ICCBR 2023), 17-20 July 2023, Aberdeen, UK. Lecture notes in computer science (LNCS), 14141. Cham: Springer [online], pages 279-293. Available from: https://doi.org/10.1007/978-3-031-40177-0_18

In this paper, we propose a novel approach to improve problem-solving efficiency through the reuse of case solutions. Specifically, we introduce the concept of failure-driven transformational case reuse of explanation strategies, which involves trans...

The current and future role of visual question answering in eXplainable artificial intelligence. (2023)
Conference Proceeding
CARO-MARTÍNEZ, M., WIJEKOON, A., DÍAZ-AGUDO, B. and RECIO-GARCÍA, J.A. 2023. The current and future role of visual question answering in eXplainable artificial intelligence. In Malburg, L. and Verma, D. (eds.) Workshop proceedings of the 31st International conference on case-based reasoning (ICCBR-WS 2023), 17 July 2023, Aberdeen, UK. CEUR workshop proceedings, 3438. Aachen: CEUR-WS [online], pages 172-183. Available from: https://ceur-ws.org/Vol-3438/paper_13.pdf

Over the last few years, we have seen how the interest of the computer science research community in eXplainable Artificial Intelligence has grown by leaps and bounds. The reason behind this rise is the use of Artificial Intelligence in many daily li...

AGREE: a feature attribution aggregation framework to address explainer disagreements with alignment metrics. (2023)
Conference Proceeding
PIRIE, C., WIRATUNGA, N., WIJEKOON, A. and MORENO-GARCIA, C.F. 2023. AGREE: a feature attribution aggregation framework to address explainer disagreements with alignment metrics. In Malburg, L. and Verma, D. (eds.) Workshop proceedings of the 31st International conference on case-based reasoning (ICCBR-WS 2023), 17 July 2023, Aberdeen, UK. CEUR workshop proceedings, 3438. Aachen: CEUR-WS [online], pages 184-199. Available from: https://ceur-ws.org/Vol-3438/paper_14.pdf

As deep learning models become increasingly complex, practitioners are relying more on post hoc explanation methods to understand the decisions of black-box learners. However, there is growing concern about the reliability of feature attribution expl...

Machine learning for risk stratification of diabetic foot ulcers using biomarkers. (2023)
Conference Proceeding
MARTIN, K., UPADHYAY, A., WIJEKOON, A., WIRATUNGA, N. and MASSIE, S. [2023]. Machine learning for risk stratification of diabetic foot ulcers using biomarkers. To be presented at the 2023 International conference on computational science (ICCS 2023): computing at the cutting edge of science, 3-5 July 2023, Prague, Czech Republic: [virtual event].

Development of a Diabetic Foot Ulcer (DFU) causes a sharp decline in a patient's health and quality of life. The process of risk stratification is crucial for informing the care that a patient should receive to help manage their Diabetes before an ul...

Conceptual modelling of explanation experiences through the iSeeonto ontology. (2023)
Conference Proceeding
CARO-MARTÍNEZ, M., WIJEKOON, A., RECIO-GARCÍA, J.A., CORSAR, D. and NKISI-ORJI, I. 2022. Conceptual modelling of explanation experiences through the iSeeonto ontology. In Reuss, P. and Schönborn, J. (eds.) Workshop proceedings of the 30th International conference on case-based reasoning (ICCBR-WS 2022), 12-15 September 2022, Nancy, France. CEUR workshop proceedings, 3389. Aachen: CEUR-WS [online], pages 117-128. Available from: https://ceur-ws.org/Vol-3389/ICCBR_2022_Workshop_paper_86.pdf

Explainable Artificial Intelligence is a large research field, required in many situations where we need to understand Artificial Intelligence behaviour. However, each explanation need is unique, which makes it difficult to apply explanation techniques a...

iSee: intelligent sharing of explanation experiences. (2023)
Conference Proceeding
MARTIN, K., WIJEKOON, A., WIRATUNGA, N., PALIHAWADANA, C., NKISI-ORJI, I., CORSAR, D., DÍAZ-AGUDO, B., RECIO-GARCÍA, J.A., CARO-MARTÍNEZ, M., BRIDGE, D., PRADEEP, P., LIRET, A. and FLEISCH, B. 2022. iSee: intelligent sharing of explanation experiences. In Reuss, P. and Schönborn, J. (eds.) Workshop proceedings of the 30th International conference on case-based reasoning (ICCBR-WS 2022), 12-15 September 2022, Nancy, France. CEUR workshop proceedings, 3389. Aachen: CEUR-WS [online], pages 231-232. Available from: https://ceur-ws.org/Vol-3389/ICCBR_2022_Workshop_paper_83.pdf

The right to an explanation of the decision reached by a machine learning (ML) model is now an EU regulation. However, different system stakeholders may have different background knowledge, competencies and goals, thus requiring different kinds of ex...

Introducing Clood CBR: a cloud based CBR framework. (2023)
Conference Proceeding
PALIHAWADANA, C., NKISI-ORJI, I., WIRATUNGA, N., CORSAR, D. and WIJEKOON, A. 2022. Introducing Clood CBR: a cloud based CBR framework. In Reuss, P. and Schönborn, J. (eds.) Workshop proceedings of the 30th International conference on case-based reasoning (ICCBR-WS 2022), 12-15 September 2022, Nancy, France. CEUR workshop proceedings, 3389. Aachen: CEUR-WS [online], pages 233-234. Available from: https://ceur-ws.org/Vol-3389/ICCBR_2022_Workshop_paper_108.pdf

CBR applications have been deployed in a wide range of sectors: from pharmaceuticals, to defence and aerospace, to IoT and transportation, to poetry and music generation, for example. However, a majority of applications have been built using monolithi...

iSee: intelligent sharing of explanation experience of users for users. (2023)
Conference Proceeding
WIJEKOON, A., WIRATUNGA, N., PALIHAWADANA, C., NKISI-ORJI, I., CORSAR, D. and MARTIN, K. 2023. iSee: intelligent sharing of explanation experience of users for users. In IUI '23 companion: companion proceedings of the 28th Intelligent user interfaces international conference 2023 (IUI 2023), 27-31 March 2023, Sydney, Australia. New York: ACM [online], pages 79-82. Available from: https://doi.org/10.1145/3581754.3584137

The right to obtain an explanation of the decision reached by an Artificial Intelligence (AI) model is now an EU regulation. Different stakeholders of an AI system (e.g. managers, developers, auditors, etc.) may have different background knowledge, c...