McIntire, J. P., Havig, P. R., Watamaniuk, S. N., & Gilkey, R. H. (2010). Visual search performance with 3-D auditory cues: Effects of motion, target location, and practice. Human Factors: The Journal of the Human Factors and Ergonomics Society.
This was an interesting paper because it was the first to examine the facilitative effects of 3-D audio cues applied to moving stimuli.
Much previous research has demonstrated that 3-D audio reduces visual search times, subjective workload, and so on. However, to the authors' knowledge, all prior work on the subject used static targets among static distractors, and this paper addresses that gap by using an environment with dynamic stimuli.
The set-up was similar to a standard visual search task. The participants' head movements were monitored via a head-tracking system, and the auditory stimuli were presented via headphones. The sound cue consisted of three consecutive 50-ms bursts of wideband white Gaussian noise separated by 25-ms gaps of silence and ending with 250 ms of silence, totaling 450 ms. The sample rate was 44,100 samples/s. In the auditory conditions, the cue was repeated during each trial until a response was given and was presented at a comfortable listening level, approximately 50 to 60 dB SPL. The sound cue was filtered with a generic set of head-related transfer functions (HRTFs) in the National Aeronautics and Space Administration's Sound Lab software (see SLAB, n.d.; also see Miller & Wenzel, 2002).
When coupled with the head-tracking system, the SLAB software rendered the auditory cue so that it was collocated with the visual target (dynamic or static) regardless of where the participant's head was pointed.
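As a rough illustration, the cue described above (three 50-ms bursts of white Gaussian noise separated by 25-ms gaps, ending with 250 ms of silence at 44,100 samples/s) could be synthesized as sketched below. The function names are my own, and the HRTF filtering and head-tracked rendering performed by SLAB are omitted.

```python
import numpy as np

FS = 44_100  # sample rate from the paper (samples/s)

def noise_burst(duration_s: float, rng: np.random.Generator) -> np.ndarray:
    """Wideband white Gaussian noise of the given duration."""
    return rng.standard_normal(int(FS * duration_s))

def silence(duration_s: float) -> np.ndarray:
    return np.zeros(int(FS * duration_s))

def build_cue(seed: int = 0) -> np.ndarray:
    """Three 50-ms noise bursts with 25-ms gaps, then 250 ms of silence:
    3*50 + 2*25 + 250 = 450 ms total, as described in the paper."""
    rng = np.random.default_rng(seed)
    parts = []
    for i in range(3):
        parts.append(noise_burst(0.050, rng))
        if i < 2:
            parts.append(silence(0.025))
    parts.append(silence(0.250))
    return np.concatenate(parts)

cue = build_cue()
print(len(cue) / FS)  # ≈ 0.45 s
```

In the experiment this cue would then be looped until the participant responded; an amplitude scale factor would set the 50 to 60 dB SPL listening level at the headphones.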
During the task, the participants had to look at a display with 15 distractors and 1 target, find the target, and report on which side of the target the stimulus gap appeared; it was a two-alternative forced-choice procedure.
There were four conditions in the experiment: (1) a static environment with no audio cues; (2) a static environment with 3-D audio cues; (3) a dynamic environment with no audio cues; and (4) a dynamic environment with 3-D audio cues. In total, the experiment recorded 2,816 trials per participant: 4 (sessions) × 4 (conditions) × 176 (trials per condition).
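The per-participant trial count quoted above is just the product of the design factors:

```python
# Sanity check of the per-participant trial count reported in the paper.
sessions, conditions, trials_per_condition = 4, 4, 176
total = sessions * conditions * trials_per_condition
print(total)  # 2816
```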
From the results, it was clear that conducting visual searches in moving environments was more difficult than searches in static environments, regardless of whether 3-D auditory cues were present. The significant main effect of auditory cue indicates that average search times decrease when 3-D auditory cues are provided, regardless of the search environment. The auditory cues reduced overall search times by an average of 430 ms (from 1,800 to 1,370 ms), an improvement of 24%.
No practice effects involving 3-D audio were found, and the beneficial effects of 3-D audio were evident in the first experimental session, supporting its purported ease of use.
The angular starting location of the target had a strong effect on search times and on the effectiveness of 3-D audio. Search times were generally faster when targets were located closer to the fixation point (smaller eccentricities) and when located on the horizontal plane. Importantly, 3-D audio provided the largest benefits to search performance when the target appeared at farther eccentricities and/or on the horizontal plane.
In conclusion, the authors suggest some ideas for future research. Objects in the real world often move along nonlinear paths in three dimensions and may appear anywhere in the spatial environment, so it may be worthwhile to examine different types of motion and search-field sizes in future work.