Listening to talking faces: Motor cortical activation during speech perception

Jeremy I. Skipper, Howard C. Nusbaum, Steven L. Small

Research output: Contribution to journal › Article

203 Citations (Scopus)

Abstract

Neurophysiological research suggests that understanding the actions of others harnesses neural circuits that would be used to produce those actions directly. We used fMRI to examine brain areas active during language comprehension in which the speaker was seen and heard while talking (audiovisual) or heard but not seen (audio-alone) or when the speaker was seen talking with the audio track removed (video-alone). We found that audiovisual speech perception activated a network of brain regions that included cortical motor areas involved in planning and executing speech production and areas subserving proprioception related to speech production. These regions included the posterior part of the superior temporal gyrus and sulcus, the pars opercularis, premotor cortex, adjacent primary motor cortex, somatosensory cortex, and the cerebellum. Activity in premotor cortex and posterior superior temporal gyrus and sulcus was modulated by the amount of visually distinguishable phonemes in the stories. None of these regions was activated to the same extent in the audio- or video-alone conditions. These results suggest that integrating observed facial movements into the speech perception process involves a network of multimodal brain regions associated with speech production and that these areas contribute less to speech perception when only auditory signals are present. This distributed network could participate in recognition processing by interpreting visual information about mouth movements as phonetic information based on motor commands that could have generated those movements.
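
The abstract describes a three-condition fMRI design (audiovisual, audio-alone, video-alone) in which premotor and posterior superior temporal activity scaled with the number of visually distinguishable phonemes (visemes) in the stories. As a rough illustration of how such a design is commonly analysed, and not the authors' actual pipeline, the Python sketch below builds toy condition regressors plus a hypothetical viseme-count parametric modulator, convolves them with a simple canonical HRF approximation, and fits an ordinary least-squares GLM to one simulated voxel. Every onset, duration, TR, and viseme count is invented for the example.

import numpy as np
from scipy.stats import gamma

# All design parameters below are hypothetical placeholders, not values from the study.
TR = 2.0                                   # assumed repetition time (s)
n_scans = 240                              # assumed run length in volumes
frame_times = np.arange(n_scans) * TR

def hrf(t, peak=6.0, undershoot=16.0, ratio=1.0 / 6.0):
    """Simple double-gamma approximation to the canonical haemodynamic response."""
    return gamma.pdf(t, peak) - ratio * gamma.pdf(t, undershoot)

def convolved(onsets, duration=20.0, amplitudes=None):
    """Boxcar regressor convolved with the HRF; per-story amplitudes give a
    crude parametric modulator (e.g., viseme count per story)."""
    box = np.zeros(n_scans)
    for i, onset in enumerate(onsets):
        amp = 1.0 if amplitudes is None else amplitudes[i]
        box[(frame_times >= onset) & (frame_times < onset + duration)] = amp
    return np.convolve(box, hrf(np.arange(0.0, 32.0, TR)))[:n_scans]

# Hypothetical story onsets (s) for each presentation condition.
av_onsets, a_onsets, v_onsets = [10, 130, 250], [50, 170, 290], [90, 210, 330]
visemes = np.array([12.0, 7.0, 18.0])      # hypothetical viseme counts per AV story
visemes_c = visemes - visemes.mean()       # mean-centre the modulator

X = np.column_stack([
    convolved(av_onsets),                          # audiovisual
    convolved(a_onsets),                           # audio-alone
    convolved(v_onsets),                           # video-alone
    convolved(av_onsets, amplitudes=visemes_c),    # viseme modulation of AV
    np.ones(n_scans),                              # intercept
])

# Simulated voxel time series with a stronger audiovisual response.
rng = np.random.default_rng(0)
y = X @ np.array([2.0, 0.5, 0.5, 1.0, 0.0]) + rng.normal(0.0, 1.0, n_scans)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("AV > mean(A, V) contrast:", np.array([1.0, -0.5, -0.5, 0.0, 0.0]) @ beta)
print("viseme modulation beta:  ", beta[3])

In an actual analysis, such condition and modulation contrasts would be estimated per voxel across the whole brain and carried to group-level statistics; this sketch only shows the general shape of a single-voxel model under the stated assumptions.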

Original language: English (US)
Pages (from-to): 76-89
Number of pages: 14
Journal: NeuroImage
Volume: 25
Issue number: 1
DOI: 10.1016/j.neuroimage.2004.11.006
State: Published - Mar 2005
Externally published: Yes

Fingerprint

Speech Perception
Motor Cortex
Temporal Lobe
Brain
Proprioception
Phonetics
Somatosensory Cortex
Cerebellum
Mouth
Language
Magnetic Resonance Imaging
Research
Wernicke Area

Keywords

  • Motor cortical activation
  • Speech perception
  • Talking faces

ASJC Scopus subject areas

  • Neurology
  • Cognitive Neuroscience

Cite this

Listening to talking faces: Motor cortical activation during speech perception. / Skipper, Jeremy I.; Nusbaum, Howard C.; Small, Steven L.

In: NeuroImage, Vol. 25, No. 1, 03.2005, p. 76-89.

@article{4adadde8bc35413790cd51f5f6da2afd,
title = "Listening to talking faces: Motor cortical activation during speech perception",
keywords = "Motor cortical activation, Speech perception, Talking faces",
author = "Skipper, {Jeremy I.} and Nusbaum, {Howard C.} and Small, {Steven L.}",
year = "2005",
month = "3",
doi = "10.1016/j.neuroimage.2004.11.006",
language = "English (US)",
volume = "25",
pages = "76--89",
journal = "NeuroImage",
issn = "1053-8119",
publisher = "Academic Press Inc.",
number = "1",
}

TY - JOUR

T1 - Listening to talking faces

T2 - Motor cortical activation during speech perception

AU - Skipper, Jeremy I.

AU - Nusbaum, Howard C.

AU - Small, Steven L.

PY - 2005/3

Y1 - 2005/3

KW - Motor cortical activation

KW - Speech perception

KW - Talking faces

UR - http://www.scopus.com/inward/record.url?scp=14244257268&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=14244257268&partnerID=8YFLogxK

U2 - 10.1016/j.neuroimage.2004.11.006

DO - 10.1016/j.neuroimage.2004.11.006

M3 - Article

C2 - 15734345

AN - SCOPUS:14244257268

VL - 25

SP - 76

EP - 89

JO - NeuroImage

JF - NeuroImage

SN - 1053-8119

IS - 1

ER -