Auditory display design for exploration in mobile audio-augmented reality

Author(s)
Vazquez-Alvarez, Yolanda; Oakley, Ian; Brewster, Stephen A.
Issued Date
2012-12
DOI
10.1007/s00779-011-0459-0
URI
https://scholarworks.unist.ac.kr/handle/201301/8423
Fulltext
https://link.springer.com/article/10.1007%2Fs00779-011-0459-0
Citation
PERSONAL AND UBIQUITOUS COMPUTING, v.16, no.8, pp.987 - 999
Abstract
In this paper, we compare four different auditory displays in a mobile audio-augmented reality environment (a sound garden). The auditory displays varied in their use of non-speech audio (Earcons) as auditory landmarks and in their use of 3D audio spatialization; the goal was to test the user experience of discovery in a purely exploratory environment that included multiple simultaneous sound sources. We present quantitative and qualitative results from an initial user study conducted in the Municipal Gardens of Funchal, Madeira. The results show that spatial audio combined with Earcons allowed users to explore multiple simultaneous sources and had the added benefit of increasing the level of immersion in the experience. In addition, spatial audio encouraged a more exploratory and playful response to the environment. An analysis of the participants' logged data suggested that the level of immersion can be related to increased instances of stopping and scanning the environment, which can be quantified in terms of walking speed and head movement.
Publisher
SPRINGER LONDON LTD
ISSN
1617-4909
Keyword (Author)
Sound garden; Spatial audio; Auditory displays; Eyes-free interaction; Mobile audio-augmented reality; Exploratory environments
Keyword
NAVIGATION PERFORMANCE