ISSN 1671-3710
CN 11-4766/R
Sponsored by: Institute of Psychology, Chinese Academy of Sciences
Published by: Science Press

Advances in Psychological Science (心理科学进展) ›› 2015, Vol. 23 ›› Issue (7): 1109-1117. doi: 10.3724/SP.J.1042.2015.01109

• Conceptual Framework •

  • Funding: Supported by the National Natural Science Foundation of China (31300835), the Humanities and Social Sciences Foundation of the Ministry of Education (12XJC190003), and the Fundamental Research Funds for the Central Universities (14SZYB07).

The Integration of Dynamic Facial and Vocal Emotion and Its Neurophysiological Mechanism

WANG Ping; PAN Zhihui; ZHANG Lijie; CHEN Xuhai   

  1. (Shaanxi Province Key Laboratory of Behavior and Cognitive Neuroscience, School of Psychology, Shaanxi Normal University, Xi’an 710062, China)
  • Received: 2014-05-19 Online: 2015-07-15 Published: 2015-07-15
  • Contact: CHEN Xuhai, E-mail: shiningocean@snnu.edu.cn; shiningoceanchan@gmail.com


Abstract:

The integration of facial and vocal emotion is an important skill for successful communication that has intrigued psychologists and neuroscientists in recent years. Previous studies have elaborated the behavioral performance of facial-vocal emotion integration and the factors that influence it, as well as “when” and “where” information from the two modalities is integrated. However, it remains an open question whether the integration of facial and vocal emotion follows the principles of multisensory integration (e.g., the principle of inverse effectiveness), and how the bimodal emotional information merges into a coherent emotional object. Therefore, taking “whether facial-vocal emotion integration obeys the principle of inverse effectiveness” as the main line, we designed six experiments that systematically manipulate the emotional salience of dynamic facial-vocal stimuli and task demands; dynamic stimuli are adopted to increase external validity. Moreover, using multi-dimensional analysis of behavioral and EEG data, especially time-frequency and coherence analysis of the EEG, we aim to answer these two questions and thereby reveal the neurophysiological mechanism of facial-vocal emotion integration.
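As a point of reference for the analytic approach named above, the sketch below (not from the article; the signals, sampling rate, and window length are all illustrative assumptions) shows what time-frequency and coherence analysis of two EEG channels looks like with SciPy:

```python
# Minimal sketch of the two EEG analyses the abstract names:
# time-frequency decomposition of one channel and spectral coherence
# between two channels. Signals here are synthetic stand-ins.
import numpy as np
from scipy import signal

fs = 500                          # assumed EEG sampling rate (Hz)
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(0)

# Two synthetic "channels" sharing a 10 Hz (alpha-band) component,
# standing in for electrodes over face- and voice-responsive regions.
shared = np.sin(2 * np.pi * 10 * t)
ch_a = shared + 0.5 * rng.standard_normal(t.size)
ch_b = shared + 0.5 * rng.standard_normal(t.size)

# Time-frequency analysis: power as a function of time and frequency.
f_tf, t_tf, power = signal.spectrogram(ch_a, fs=fs, nperseg=256)

# Coherence analysis: frequency-resolved coupling between channels.
f_coh, coh = signal.coherence(ch_a, ch_b, fs=fs, nperseg=256)

# The shared alpha rhythm should yield high coherence near 10 Hz.
idx = np.argmin(np.abs(f_coh - 10))
print(f"coherence near 10 Hz: {coh[idx]:.2f}")
```

In the proposed experiments such measures would be computed on real multi-channel recordings rather than synthetic signals; the point of the sketch is only the distinction between a per-channel time-frequency representation and a between-channel coherence spectrum.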

Key words: facial emotion, vocal emotion, integration, neural oscillation, ERPs