In: Psychology
Experiments designed and evaluated by Holger and Eva to determine how visual information influences speech perception:
Two experiments examined the time course of the use of auditory and visual speech cues in spoken word recognition using an eye-tracking paradigm. Results of the first experiment showed that the use of visual speech cues from lip-reading is reduced when concurrently presented pictures require a division of attentional resources. This reduction was evident even when listeners’ eye gaze was on the speaker rather than the (static) pictures. Experiment 2 used a deictic hand gesture to foster attention to the speaker. At the same time, the visual processing load was reduced by keeping the visual display constant over a fixed number of successive trials. Under these conditions, the visual speech cues from lip-reading were used.
Moreover, the eye-tracking data indicated that visual information was used immediately and even earlier than auditory information. In combination, these findings indicate that visual speech cues are not used automatically, but when they are used, they are used immediately.