McCann, R. S., Foyle, D. C., & Johnston, J. C. (1993). Attentional limitations with head-up displays. Proceedings of the Seventh International Symposium on Aviation Psychology (pp. 70–75).
In 1993, McCann and collaborators set out to test whether this effect also held when using a HUD in an aviation task. The authors expected that visual attention could be focused either on the HUD or on the world beyond it, but not on both simultaneously. Their critique targeted the "blind" application of HUDs to aircraft without consideration of important human factors, from whose perspective the parallel-processing assumption is questionable.
According to the authors, there are three perceptual cues (bear in mind we are in 1993) that help distinguish HUD symbology from the world:
1) HUDs are stationary;
2) HUDs are generally drawn in highly saturated green;
3) HUDs are oriented vertically with respect to the eye plane.
Since these cues are distinguishable, the visual system is likely to group the information presented on the HUD as one perceptual group and the world as another. This may have a very negative influence on the main task of piloting or driving: objects in the world may not be processed in parallel with HUD symbology, and transitioning between the HUD and the world may be slowed by the need to shift attention between groups.
The study therefore had two goals. The first was to test the hypothesis that the visual system parses the HUD and the world as separate perceptual groups, so that when attention is focused on one (the HUD or the world), objects in the other are excluded from processing. The second was to determine whether transitioning from the HUD to the world (and vice versa) requires a shift of attention.
The task was a detection task in a low-fidelity simulated approach to a runway, where participants had to find a target (a stop sign or a diamond sign). Before the potential targets appeared (three geometric symbols on the HUD and another three on the runway), participants were cued as to whether the relevant set of symbols would appear on the HUD or on the runway. They were told that if the relevant target was a stop sign, the runway was closed, and they should signal their intention to go around by striking the upper key as quickly as possible. If instead the relevant target was a diamond, the runway was open, and they should signal their intention to continue the landing by pressing the lower key as quickly as possible.
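The stimulus-response mapping of the task can be sketched as a tiny function; this is purely an illustrative reconstruction (the function and label names are assumptions, not the authors' materials):

```python
# Hypothetical sketch of the trial's stimulus-response mapping.
# Labels ("stop", "diamond", "upper_key", "lower_key") are illustrative
# names for the signs and response keys described in the study.

def respond(target: str) -> str:
    """Map the detected target sign to the required key press."""
    if target == "stop":
        return "upper_key"   # runway closed -> signal a go-around
    if target == "diamond":
        return "lower_key"   # runway open -> continue the landing
    raise ValueError(f"unexpected target: {target}")
```

The point of the mapping is simply that each trial forces a speeded two-alternative choice, so reaction time can serve as the dependent measure.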
The results supported both hypotheses. When subjects focused on the HUD for the duration of the trial, conflicting information in the world had little effect; likewise, when subjects focused on the world, conflicting information on the HUD had little influence. These results add to previous findings that when pilots focus attention on the HUD, objects in the world are excluded from processing. When the target appeared in the uncued group (an incongruent trial), the cost of transitioning from one group to the other was as much as 150 msec (for HUD-to-runway transitions).
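The shift cost reported here is a simple difference of mean reaction times between trials where the target appeared in the cued group and trials where it appeared in the other group. A minimal sketch of that computation, using made-up reaction times (the paper's actual data are not reproduced here):

```python
# Illustrative attention-shift-cost computation: mean RT on uncued-location
# trials minus mean RT on cued-location trials. All numbers are invented
# for illustration only; the paper reports costs of up to ~150 msec for
# HUD-to-runway transitions.

def mean(xs):
    return sum(xs) / len(xs)

def shift_cost_ms(cued_rts, uncued_rts):
    """Shift cost in msec: extra time needed when the target appears
    in the perceptual group attention was not focused on."""
    return mean(uncued_rts) - mean(cued_rts)

cued = [520, 540, 530]      # target in the cued group (msec)
uncued = [670, 690, 680]    # target in the other group (msec)
print(shift_cost_ms(cued, uncued))  # 150.0
```

A positive cost indicates that re-orienting attention from one perceptual group to the other takes measurable time, which is the paper's central behavioral claim.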
The authors suggest some design implications for future HUDs. Since HUDs do not seem to eliminate transition times between instrument processing and world processing, future HUDs should be designed with an eye toward removing the cues that cause the visual system to segregate HUD symbology from the world. For example, suppose that perspective cues and differential-motion cues are largely responsible for the segregation; the problem could then be attenuated by designing HUD symbology to be as conformal as possible with the out-the-window scene.