Most people can't read lips. If you turn down the sound on your television, you can see why it is difficult: unless trained, when you watch someone speak a sentence without the accompanying sounds, you are unlikely to recognize many words. It turns out, however, that people can lip-read themselves better than they can lip-read others, and that reveals an interesting link between speech perception and speech production.

Nancy Tye-Murray and colleagues from Washington University constructed simple, nonsensical sentences from word boards, such as "The duck watched the boy" and "The snail watched the goose," so that participants could easily identify and recognize individual words. Twenty adults recorded the sentences and, after several weeks, lip-read silent video clips of sentences spoken both by themselves and by nine other participants.

Participants lip-read video clips of themselves consistently more accurately than video clips of others. These findings suggest that seeing someone speak activates speech processes that link 'seen' words to 'actual' words in the mental lexicon, and that this activation is particularly strong when you see yourself speak.

The authors conclude: "This study is one of the first to show that not only can people recognize their own actions from those of others, but they can better interpret their own actions. A strong link may exist between how we perform actions and how we perceive actions; that is, we may activate some of the very same mental representations when performing and when perceiving. These findings have important implications for understanding how we learn new actions and, particularly, for how we learn to recognize and produce speech."

Citation: Tye-Murray N et al (2012). Reading your own lips: common-coding theory and visual speech perception. Psychonomic Bulletin & Review. DOI: 10.3758/s13423-012-0328-5