Citation Information

Title The visuo-sensorimotor substrate of co-speech gesture processing
Authors 張葶葶; 徐嘉慧
Chang, Ting-Ting; Chui, Kawai; Ng, Chan-Tat
Contributors Department of Psychology; Department of English
Keywords Iconic gesture; Beat gesture; Self-adaptor; Gesture-speech processing; Functional MRI
Date 2023-11
Uploaded 5-Mar-2024 16:27:11 (UTC+8)
Abstract Co-speech gestures are integral to human communication and take diverse forms, each serving a distinct communicative function. However, the existing literature has focused on individual gesture types, leaving a gap in understanding the comparative neural processing of these diverse forms. To address this, our study investigated the neural processing of two types of iconic gestures, which represent attributes or event knowledge of entity concepts; beat gestures, which enact rhythmic manual movements without semantic information; and self-adaptors. During functional magnetic resonance imaging, systematic randomization and attentive observation of video stimuli revealed a general neural substrate for co-speech gesture processing primarily in the bilateral middle temporal and inferior parietal cortices, characterizing visuospatial attention, semantic integration of cross-modal information, and multisensory processing of manual and audiovisual inputs. Specific types of gestures and grooming movements elicited distinct neural responses. Greater activity in the right supramarginal and inferior frontal regions was specific to self-adaptors and is relevant to the spatiomotor and integrative processing of speech and gestures. The semantic and sensorimotor regions were least active for beat gestures. The processing of attribute gestures was most pronounced in the left posterior middle temporal gyrus upon access to knowledge of entity concepts. This fMRI study illuminated the neural underpinnings of gesture-speech integration and highlighted the differential processing pathways for various co-speech gestures.
Relation Neuropsychologia, Vol. 190, 108697
Type article
DOI https://doi.org/10.1016/j.neuropsychologia.2023.108697
URI https://nccur.lib.nccu.edu.tw/handle/140.119/150410