Co-speech gestures influence neural activity in brain regions associated with processing semantic information

Anthony Steven Dick, Susan Goldin-Meadow, Uri Hasson, Jeremy I. Skipper, Steven L. Small

Research output: Contribution to journal › Article

95 Citations (Scopus)

Abstract

Everyday communication is accompanied by visual information from several sources, including co-speech gestures, which provide semantic information listeners use to help disambiguate the speaker's message. Using fMRI, we examined how gestures influence neural activity in brain regions associated with processing semantic information. The BOLD response was recorded while participants listened to stories under three audiovisual conditions and one auditory-only (speech alone) condition. In the first audiovisual condition, the storyteller produced gestures that naturally accompany speech. In the second, the storyteller made semantically unrelated hand movements. In the third, the storyteller kept her hands still. In addition to inferior parietal and posterior superior and middle temporal regions, bilateral posterior superior temporal sulcus and left anterior inferior frontal gyrus responded more strongly to speech when it was further accompanied by gesture, regardless of the semantic relation to speech. However, the right inferior frontal gyrus was sensitive to the semantic import of the hand movements, demonstrating more activity when hand movements were semantically unrelated to the accompanying speech. These findings show that perceiving hand movements during speech modulates the distributed pattern of neural activation involved in both biological motion perception and discourse comprehension, suggesting listeners attempt to find meaning, not only in the words speakers produce, but also in the hand movements that accompany speech.

Original language: English (US)
Pages (from-to): 3509-3526
Number of pages: 18
Journal: Human Brain Mapping
Volume: 30
Issue number: 11
DOI: 10.1002/hbm.20774
ISSN: 1065-9471
PMID: 19384890
State: Published - Nov 1, 2009
Externally published: Yes


Keywords

  • Discourse comprehension
  • fMRI
  • Gestures
  • Inferior frontal gyrus
  • Semantic processing

ASJC Scopus subject areas

  • Anatomy
  • Radiological and Ultrasound Technology
  • Radiology Nuclear Medicine and imaging
  • Neurology
  • Clinical Neurology

Cite this

Dick, A. S., Goldin-Meadow, S., Hasson, U., Skipper, J. I., & Small, S. L. (2009). Co-speech gestures influence neural activity in brain regions associated with processing semantic information. Human Brain Mapping, 30(11), 3509-3526. https://doi.org/10.1002/hbm.20774
