Gulfs of expectation: eliciting and verifying differences in trust expectations using personas

Faily, Shamal; Power, David; Fléchais, Ivan

Authors

Shamal Faily

David Power

Ivan Fléchais



Abstract

Personas are a common tool used in Human Computer Interaction to represent the needs and expectations of a system's stakeholders, but they are also grounded in large amounts of qualitative data. Our aim is to make use of this data to anticipate the differences between a user persona's expectations of a system, and the expectations held by its developers. This paper introduces the idea of gulfs of expectation – the gap between the expectations held by a user about a system and its developers, and the expectations held by a developer about the system and its users. By evaluating these differences in expectation against a formal representation of a system, we demonstrate how differences between the anticipated user and developer mental models of the system can be verified. We illustrate this using a case study where persona characteristics were analysed to identify divergent behaviour and potential security breaches as a result of differing trust expectations.

Citation

FAILY, S., POWER, D. and FLÉCHAIS, I. 2016. Gulfs of expectation: eliciting and verifying differences in trust expectations using personas. Journal of trust management [online], 3, article number 4. Available from: https://doi.org/10.1186/s40493-016-0025-9

Journal Article Type Article
Acceptance Date Jul 21, 2016
Online Publication Date Jul 29, 2016
Publication Date Dec 31, 2016
Deposit Date Sep 29, 2021
Publicly Available Date Dec 7, 2021
Journal Journal of trust management
Print ISSN 2196-064X
Electronic ISSN 2196-064X
Publisher Springer
Peer Reviewed Peer Reviewed
Volume 3
Article Number 4
DOI https://doi.org/10.1186/s40493-016-0025-9
Keywords User personas; User-centred design; Systems security; Security risk analysis; Human-computer interaction (HCI)
Public URL https://rgu-repository.worktribe.com/output/1427949
