Common and differential electrophysiological mechanisms underlying semantic object memory retrieval probed by features presented in different stimulus types

Hsueh Sheng Chiang, Justin Eroh, Jeffrey S. Spence, Michael A. Motes, Mandy J. Maguire, Daniel C. Krawczyk, Matthew R. Brier, John Hart, Michael A. Kraut

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

How the brain combines the neural representations of features that comprise an object in order to activate a coherent object memory is poorly understood, especially when the features are presented in different modalities (visual vs. auditory) and domains (verbal vs. nonverbal). We examined this question using three versions of a modified Semantic Object Retrieval Test, in which object memory was probed by a feature presented as a written word, a spoken word, or a picture, followed by a second feature always presented as a visual word. Participants indicated whether each feature pair elicited retrieval of the memory of a particular object. Sixteen subjects completed one of the three versions (N = 48 in total) while their EEG was recorded. We analyzed the EEG data in four separate frequency bands (delta: 1–4 Hz; theta: 4–7 Hz; alpha: 8–12 Hz; beta: 13–19 Hz) using a multivariate data-driven approach. We found that alpha power time-locked to response was modulated by both cross-modality (visual vs. auditory) and cross-domain (verbal vs. nonverbal) probing of semantic object memory. In addition, retrieval trials showed greater changes in all frequency bands than non-retrieval trials across all stimulus types in both response-locked and stimulus-locked analyses, suggesting dissociable neural subcomponents involved in binding object features to retrieve a memory. We conclude that these findings support both modality/domain-dependent and modality/domain-independent mechanisms during semantic object memory retrieval.
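
To make the frequency-band analysis concrete, the following is a minimal sketch, not the authors' multivariate data-driven pipeline: it estimates mean Welch spectral power in the four bands named in the abstract for stimulus-locked epochs. The band ranges come from the abstract; the sampling rate (FS), the epoch shape, the helper band_power, and the simulated data are illustrative assumptions only.

import numpy as np
from scipy.signal import welch

# Band ranges from the abstract; FS is an assumed sampling rate, not from the paper.
FS = 250  # Hz (assumed)
BANDS = {"delta": (1, 4), "theta": (4, 7), "alpha": (8, 12), "beta": (13, 19)}

def band_power(epochs, fs=FS, bands=BANDS):
    """Mean Welch power per band for epochs shaped (n_trials, n_channels, n_samples)."""
    freqs, psd = welch(epochs, fs=fs, nperseg=fs, axis=-1)  # PSD per trial and channel
    return {name: psd[..., (freqs >= lo) & (freqs <= hi)].mean(axis=-1)
            for name, (lo, hi) in bands.items()}

# Toy usage: 48 simulated trials, 64 channels, 1-s stimulus-locked epochs of noise.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((48, 64, FS))
powers = band_power(epochs)
print(powers["alpha"].shape)  # (48, 64): one alpha-power value per trial and channel

Contrasting such band-power estimates between retrieval and non-retrieval trials, or across the three stimulus types, illustrates the kind of comparison the study's multivariate analysis formalizes.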

Original language: English (US)
Pages (from-to): 77-86
Number of pages: 10
Journal: International Journal of Psychophysiology
Volume: 106
DOIs: 10.1016/j.ijpsycho.2016.06.011
State: Published - Aug 1 2016

Keywords

  • EEG
  • Memory retrieval
  • Neural oscillations
  • Semantics

ASJC Scopus subject areas

  • Neuroscience(all)
  • Neuropsychology and Physiological Psychology
  • Physiology (medical)

Cite this

Common and differential electrophysiological mechanisms underlying semantic object memory retrieval probed by features presented in different stimulus types. / Chiang, Hsueh Sheng; Eroh, Justin; Spence, Jeffrey S.; Motes, Michael A.; Maguire, Mandy J.; Krawczyk, Daniel C.; Brier, Matthew R.; Hart, John; Kraut, Michael A.

In: International Journal of Psychophysiology, Vol. 106, 01.08.2016, p. 77-86.

Research output: Contribution to journal › Article

Chiang, Hsueh Sheng ; Eroh, Justin ; Spence, Jeffrey S. ; Motes, Michael A. ; Maguire, Mandy J. ; Krawczyk, Daniel C. ; Brier, Matthew R. ; Hart, John ; Kraut, Michael A. / Common and differential electrophysiological mechanisms underlying semantic object memory retrieval probed by features presented in different stimulus types. In: International Journal of Psychophysiology. 2016 ; Vol. 106. pp. 77-86.
@article{d196ce49dc7a4257aaf8b091175ca1fb,
title = "Common and differential electrophysiological mechanisms underlying semantic object memory retrieval probed by features presented in different stimulus types",
abstract = "How the brain combines the neural representations of features that comprise an object in order to activate a coherent object memory is poorly understood, especially when the features are presented in different modalities (visual vs. auditory) and domains (verbal vs. nonverbal). We examined this question using three versions of a modified Semantic Object Retrieval Test, where object memory was probed by a feature presented as a written word, a spoken word, or a picture, followed by a second feature always presented as a visual word. Participants indicated whether each feature pair elicited retrieval of the memory of a particular object. Sixteen subjects completed one of the three versions (N = 48 in total) while their EEG were recorded simultaneously. We analyzed EEG data in four separate frequency bands (delta: 1–4 Hz, theta: 4–7 Hz; alpha: 8–12 Hz; beta: 13–19 Hz) using a multivariate data-driven approach. We found that alpha power time-locked to response was modulated by both cross-modality (visual vs. auditory) and cross-domain (verbal vs. nonverbal) probing of semantic object memory. In addition, retrieval trials showed greater changes in all frequency bands compared to non-retrieval trials across all stimulus types in both response-locked and stimulus-locked analyses, suggesting dissociable neural subcomponents involved in binding object features to retrieve a memory. We conclude that these findings support both modality/domain-dependent and modality/domain-independent mechanisms during semantic object memory retrieval.",
keywords = "EEG, Memory retrieval, Neural oscillations, Semantics",
author = "Chiang, {Hsueh Sheng} and Justin Eroh and Spence, {Jeffrey S.} and Motes, {Michael A.} and Maguire, {Mandy J.} and Krawczyk, {Daniel C.} and Brier, {Matthew R.} and John Hart and Kraut, {Michael A.}",
year = "2016",
month = "8",
day = "1",
doi = "10.1016/j.ijpsycho.2016.06.011",
language = "English (US)",
volume = "106",
pages = "77--86",
journal = "International Journal of Psychophysiology",
issn = "0167-8760",
publisher = "Elsevier",

}

TY - JOUR

T1 - Common and differential electrophysiological mechanisms underlying semantic object memory retrieval probed by features presented in different stimulus types

AU - Chiang, Hsueh Sheng

AU - Eroh, Justin

AU - Spence, Jeffrey S.

AU - Motes, Michael A.

AU - Maguire, Mandy J.

AU - Krawczyk, Daniel C.

AU - Brier, Matthew R.

AU - Hart, John

AU - Kraut, Michael A.

PY - 2016/8/1

Y1 - 2016/8/1

KW - EEG

KW - Memory retrieval

KW - Neural oscillations

KW - Semantics

UR - http://www.scopus.com/inward/record.url?scp=84976299584&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84976299584&partnerID=8YFLogxK

U2 - 10.1016/j.ijpsycho.2016.06.011

DO - 10.1016/j.ijpsycho.2016.06.011

M3 - Article

C2 - 27329353

AN - SCOPUS:84976299584

VL - 106

SP - 77

EP - 86

JO - International Journal of Psychophysiology

JF - International Journal of Psychophysiology

SN - 0167-8760

ER -