TY - JOUR
T1 - Multimodal feature binding in object memory retrieval using event-related potentials
T2 - Implications for models of semantic memory
AU - Chiang, Hsueh Sheng
AU - Spence, Jeffrey S.
AU - Eroh, Justin T.
AU - Maguire, Mandy J.
AU - Kraut, Michael A.
AU - Hart, John
N1 - Publisher Copyright:
© 2020
PY - 2020/7
Y1 - 2020/7
N2 - To test the hypothesis that semantic processes are represented in multiple subsystems, we recorded the electroencephalogram (EEG) as we elicited object memories using the modified Semantic Object Retrieval Test, during which an object feature, presented as a visual word [VW], an auditory word [AW], or a picture [Pic], was followed by a second feature always presented as a visual word. We performed both hypothesis-driven and data-driven analyses using event-related potentials (ERPs) time-locked to the second stimulus. We replicated a previously reported left fronto-temporal ERP effect (750–1000 ms post-stimulus) in the VW task, and also found that this ERP component was only present during object memory retrieval in verbal (VW, AW) as opposed to non-verbal (Pic) stimulus types. We also found a right temporal ERP effect (850–1000 ms post-stimulus) that was present in auditory (AW) but not in visual (VW, Pic) stimulus types. In addition, we found an earlier left temporo-parietal ERP effect between 350 and 700 ms post-stimulus and a later midline parietal ERP effect between 700 and 1100 ms post-stimulus, present in all stimulus types, suggesting common neural mechanisms for object retrieval processes and object activation, respectively. These findings support multiple semantic subsystems that respond to varying stimulus modalities, and argue against an ultimate unitary amodal semantic analysis.
AB - To test the hypothesis that semantic processes are represented in multiple subsystems, we recorded the electroencephalogram (EEG) as we elicited object memories using the modified Semantic Object Retrieval Test, during which an object feature, presented as a visual word [VW], an auditory word [AW], or a picture [Pic], was followed by a second feature always presented as a visual word. We performed both hypothesis-driven and data-driven analyses using event-related potentials (ERPs) time-locked to the second stimulus. We replicated a previously reported left fronto-temporal ERP effect (750–1000 ms post-stimulus) in the VW task, and also found that this ERP component was only present during object memory retrieval in verbal (VW, AW) as opposed to non-verbal (Pic) stimulus types. We also found a right temporal ERP effect (850–1000 ms post-stimulus) that was present in auditory (AW) but not in visual (VW, Pic) stimulus types. In addition, we found an earlier left temporo-parietal ERP effect between 350 and 700 ms post-stimulus and a later midline parietal ERP effect between 700 and 1100 ms post-stimulus, present in all stimulus types, suggesting common neural mechanisms for object retrieval processes and object activation, respectively. These findings support multiple semantic subsystems that respond to varying stimulus modalities, and argue against an ultimate unitary amodal semantic analysis.
KW - EEG
KW - ERP
KW - Object features
KW - Object memory
KW - Semantic memory
KW - Semantic systems
UR - http://www.scopus.com/inward/record.url?scp=85084761095&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85084761095&partnerID=8YFLogxK
U2 - 10.1016/j.ijpsycho.2020.04.024
DO - 10.1016/j.ijpsycho.2020.04.024
M3 - Article
C2 - 32389620
AN - SCOPUS:85084761095
SN - 0167-8760
VL - 153
SP - 116
EP - 126
JO - International Journal of Psychophysiology
JF - International Journal of Psychophysiology
ER -