Autonomous Vehicles (AVs) are likely to have significant social, cultural, spatial and environmental implications, and the interaction between humans, automated vehicles and the physical environment will present an array of challenges. This paper explores the use of innovative visualisation approaches to communicate and foster discussion around possible scenarios involving AVs. It is argued that such an approach can help conceptualise human experiences, with the potential to enhance public engagement and understanding of complex human-machine associations and to open a dialogue with potential end users. Presenting journeys from different perspectives, and reconceptualising the context through the eyes of AVs, emphasised the nuances of experience between machines, urban space and human bodies. Unexpected user-technology interactions and experiences will emerge, as humans are not always sensible and passive followers, and can be apprehensive about accepting a technology as novel as self-driving vehicles on the roads. The methodology and data capture focused on inclusivity of data, with an aspiration to capture not only movement but also noise and the human experience of a space. The integration of AVs on public roads will rely on technical innovation to ensure that vehicles can operate safely in a practical sense; yet studying the perceptual and ethical effects of new technology, and its potential influences on society, by engaging the general public in the process will help to manage expectations and create platforms for mutual learning.
BELKOURI, D., LAING, R. and GRAY, D. 2022. Through the eyes of autonomous vehicles: using laser scanning technology to engage the public via the analysis of journeys seen from a different perspective. Transportation research procedia [online], 60: proceedings of the 25th Living and walking in cities international conference 2021 (LWC 2021): new scenarios for safe mobility in urban areas, 9-10 September 2021, Brescia, Italy, pages 496-503. Available from: https://doi.org/10.1016/j.trpro.2021.12.064