[1] Zhang L., Sun X. H., & Zhang K. (2009). Multi-channel integration of emotional information [in Chinese]. Advances in Psychological Science, 17(6), 1133-1138.
[2] Wang P., Pan Z. H., Zhang L. J., & Chen X. H. (2015). Integrative processing and neurophysiological mechanisms of dynamic facial and vocal emotional information [in Chinese]. Advances in Psychological Science, 23(7), 1109-1117.
[3] Armony J. L., & Dolan R. J. (2000). Modulation of attention by threat stimuli: An fMRI study. Journal of Cognitive Neuroscience, 53-53.
[4] Balconi M., & Carrera A. (2011). Cross-modal integration of emotional face and voice in congruous and incongruous pairs: The P2 ERP effect. Journal of Cognitive Psychology, 23(1), 132-139.
[5] Belyk M., Brown S., Lim J., & Kotz S. A. (2017). Convergence of semantics and emotional expression within the IFG pars orbitalis. NeuroImage, 156, 240-248.
[6] Calvo M. G., Beltrán D., & Fernández-Martín A. (2014). Processing of facial expressions in peripheral vision: Neurophysiological evidence. Biological Psychology, 100, 60-70.
[7] Calvo M. G., & Nummenmaa L. (2016). Perceptual and affective mechanisms in facial expression recognition: An integrative review. Cognition and Emotion, 30(6), 1081-1106.
[8] Campanella S., & Belin P. (2007). Integrating face and voice in person perception. Trends in Cognitive Sciences, 11(12), 535-543.
[9] Campanella S., Bruyer R., Froidbise S., Rossignol M., Joassin F., Kornreich C., ... Verbanck P. (2010). Is two better than one? A cross-modal oddball paradigm reveals greater sensitivity of the P300 to emotional face-voice associations. Clinical Neurophysiology, 121(11), 1855-1862.
[10] Chen X. H., Edgar J. C., Holroyd T., Dammers J., Thönnessen H., Roberts T. P. L., & Mathiak K. (2010). Neuromagnetic oscillations to emotional faces and prosody. European Journal of Neuroscience, 31(10), 1818-1827.
[11] Chen X. H., Han L. Z., Pan Z. H., Luo Y. M., & Wang P. (2016). Influence of attention on bimodal integration during emotional change decoding: ERP evidence. International Journal of Psychophysiology, 106, 14-20.
[12] Chen X. H., Pan Z. H., Wang P., Yang X. H., Liu P., You X. Q., & Yuan J. J. (2016). The integration of facial and vocal cues during emotional change perception: EEG markers. Social Cognitive and Affective Neuroscience, 11(7), 1152-1161.
[13] Chen X. H., Pan Z. H., Wang P., Zhang L. J., & Yuan J. J. (2015). EEG oscillations reflect task effects for the change detection in vocal emotion. Cognitive Neurodynamics, 9(3), 351-358.
[14] Chen X. H., Yang J. F., Gan S. Z., & Yang Y. F. (2012). The contribution of sound intensity in vocal emotion perception: Behavioral and electrophysiological evidence. PLoS One, 7(1), e30278.
[15] Collignon O., Girard S., Gosselin F., Roy S., Saint-Amour D., Lassonde M., & Lepore F. (2008). Audio-visual integration of emotion expression. Brain Research, 1242, 126-135.
[16] Cuthbert B. N., Schupp H. T., Bradley M. M., Birbaumer N., & Lang P. J. (2000). Brain potentials in affective picture processing: Covariation with autonomic arousal and affective report. Biological Psychology, 52(2), 95-111.
[17] de Gelder B., & Vroomen J. (2000). The perception of emotions by ear and by eye. Cognition and Emotion, 14(3), 289-311.
[18] Delle-Vigne D., Kornreich C., Verbanck P., & Campanella S. (2015). The P300 component wave reveals differences in subclinical anxious-depressive states during bimodal oddball tasks: An effect of stimulus congruence. Clinical Neurophysiology, 126(11), 2108-2123.
[19] Ding R., Li P., Wang W., & Luo W. (2017). Emotion processing by ERP combined with development and plasticity. Neural Plasticity, 2017, 5282670.
[20] Doi H., & Shinohara K. (2015). Unconscious presentation of fearful face modulates electrophysiological responses to emotional prosody. Cerebral Cortex, 25(3), 817-832.
[21] Dolan R. J., Morris J. S., & de Gelder B. (2001). Crossmodal binding of fear in voice and face. Proceedings of the National Academy of Sciences of the United States of America, 98(17), 10006-10010.
[22] Epperson C. N., Amin Z., Ruparel K., Gur R., & Loughead J. (2012).
Interactive effects of estrogen and serotonin on brain activation during working memory and affective processing in menopausal women. Psychoneuroendocrinology, 37(3), 372-382.
[23] Ethofer T., Anders S., Erb M., Herbert C., Wiethoff S., Kissler J., ... Wildgruber D. (2006). Cerebral pathways in processing of affective prosody: A dynamic causal modeling study. NeuroImage, 30(2), 580-597.
[24] Ethofer T., Pourtois G., & Wildgruber D. (2006). Investigating audiovisual integration of emotional signals in the human brain. Progress in Brain Research, 156, 345-361.
[25] Fingelkurts A. A., Fingelkurts A. A., & Kähkönen S. (2005). Functional connectivity in the brain--Is it an elusive concept? Neuroscience & Biobehavioral Reviews, 28(8), 827-836.
[26] Föcker J., Gondan M., & Röder B. (2011). Preattentive processing of audio-visual emotional signals. Acta Psychologica, 137(1), 36-47.
[27] Gao Z. F., Goldstein A., Harpaz Y., Hansel M., Zion-Golumbic E., & Bentin S. (2013). A magnetoencephalographic study of face processing: M170, gamma-band oscillations and source localization. Human Brain Mapping, 34(8), 1783-1795.
[28] Hagan C. C., Woods W., Johnson S., Calder A. J., Green G. G. R., & Young A. W. (2009). MEG demonstrates a supra-additive response to facial and vocal emotion in the right superior temporal sulcus. Proceedings of the National Academy of Sciences of the United States of America, 106(47), 20010-20015.
[29] Hagan C. C., Woods W., Johnson S., Green G. G. R., & Young A. W. (2013). Involvement of right STS in audio-visual integration for affective speech demonstrated using MEG. PLoS One, 8(8), e70648.
[30] Hernández-Gutiérrez D., Abdel Rahman R., Martín-Loeches M., Muñoz F., Schacht A., & Sommer W. (2018). Does dynamic information about the speaker's face contribute to semantic speech processing? ERP evidence. Cortex, 104, 12-25.
[31] Ho H. T., Schröger E., & Kotz S. A. (2015). Selective attention modulates early human evoked potentials during emotional face-voice processing. Journal of Cognitive Neuroscience, 27(4), 798-818.
[32] Huang X. Q., Zhang J., Liu J., Sun L., Zhao H. Y., Lu Y. G., ... Li J. (2012). C-reactive protein promotes adhesion of monocytes to endothelial cells via NADPH oxidase-mediated oxidative stress. Journal of Cellular Biochemistry, 113(3), 857-867.
[33] Jessen S., & Kotz S. A. (2011). The temporal dynamics of processing emotions from vocal, facial, and bodily expressions. NeuroImage, 58(2), 665-674.
[34] Jia G., Peng X., Li Y., Hua S., & Zhao X. J. (2012). The oscillatory activities and its synchronization in auditory-visual integration as revealed by event-related potentials to bimodal stimuli. Proceedings of SPIE - The International Society for Optical Engineering, 8291(1), 52.
[35] Kaiser J., Hertrich I., Ackermann H., Mathiak K., & Lutzenberger W. (2005). Hearing lips: Gamma-band activity during audiovisual speech perception. Cerebral Cortex, 15(5), 646-653.
[36] Klasen M., Chen Y. H., & Mathiak K. (2012). Multisensory emotions: Perception, combination and underlying neural processes. Reviews in the Neurosciences, 23(4), 381-392.
[37] Klasen M., Kenworthy C. A., Mathiak K. A., Kircher T. T. J., & Mathiak K. (2011). Supramodal representation of emotions. Journal of Neuroscience, 31(38), 15218-15218.
[38] Klasen M., Kreifelts B., Chen Y. H., Seubert J., & Mathiak K. (2014). Neural processing of emotion in multimodal settings. Frontiers in Human Neuroscience, 8, 822.
[39] Knowland V. C. P., Mercure E., Karmiloff-Smith A., Dick F., & Thomas M. S. C. (2014). Audio-visual speech perception: A developmental ERP investigation. Developmental Science, 17(1), 110-124.
[40] Kober H., Barrett L. F., Joseph J., Bliss-Moreau E., Lindquist K., & Wager T. D. (2008). Functional grouping and cortical-subcortical interactions in emotion: A meta-analysis of neuroimaging studies. NeuroImage, 42(2), 998-1031.
[41] Kokinous J., Kotz S. A., Tavano A., & Schröger E. (2015). The role of emotion in dynamic audiovisual integration of faces and voices. Social Cognitive and Affective Neuroscience, 10(5), 713-720.
[42] Kokinous J., Tavano A., Kotz S. A., & Schröger E. (2017). Perceptual integration of faces and voices depends on the interaction of emotional content and spatial frequency. Biological Psychology, 123, 155-165.
[43] Kreifelts B., Ethofer T., Grodd W., Erb M., & Wildgruber D. (2007). Audiovisual integration of emotional signals in voice and face: An event-related fMRI study. NeuroImage, 37(4), 1445-1456.
[44] Kreifelts B., Ethofer T., Huberle E., Grodd W., & Wildgruber D. (2010). Association of trait emotional intelligence and individual fMRI-activation patterns during the perception of social signals from voice and face. Human Brain Mapping, 31(7), 979-991.
[45] Kreifelts B., Ethofer T., Shiozawa T., Grodd W., & Wildgruber D. (2009). Cerebral representation of non-verbal emotional perception: fMRI reveals audiovisual integration area between voice- and face-sensitive regions in the superior temporal sulcus. Neuropsychologia, 47(14), 3059-3066.
[46] Kuhn L. K., Wydell T., Lavan N., McGettigan C., & Garrido L. (2017). Similar representations of emotions across faces and voices. Emotion, 17(6), 912-937.
[47] Kumar G. V., Kumar N., Roy D., & Banerjee A. (2018). Segregation and integration of cortical information processing underlying cross-modal perception. Multisensory Research, 31(5), 481-500.
[48] Lin Y. F., Liu B. L., Liu Z. W., & Gao X. R. (2015). EEG gamma-band activity during audiovisual speech comprehension in different noise environments. Cognitive Neurodynamics, 9(4), 389-398.
[49] Liu P., Rigoulot S., & Pell M. D. (2015). Culture modulates the brain response to human expressions of emotion: Electrophysiological evidence. Neuropsychologia, 67, 1-13.
[50] Maier J.
X., Chandrasekaran C., & Ghazanfar A. A. (2008). Integration of bimodal looming signals through neuronal coherence in the temporal lobe. Current Biology, 18(13), 963-968.
[51] Mileva M., Tompkinson J., Watt D., & Burton A. M. (2018). Audiovisual integration in social evaluation. Journal of Experimental Psychology: Human Perception and Performance, 44(1), 128-138.
[52] Müller V. I., Cieslik E. C., Turetsky B. I., & Eickhoff S. B. (2012). Crossmodal interactions in audiovisual emotion processing. NeuroImage, 60(1), 553-561.
[53] Noy D., Mouta S., Lamas J., Basso D., Silva C., & Santos J. A. (2017). Audiovisual integration increases the intentional step synchronization of side-by-side walkers. Human Movement Science, 56, 71-87.
[54] Olofsson J. K., & Polich J. (2007). Affective visual event-related potentials: Arousal, repetition, and time-on-task. Biological Psychology, 75(1), 101-108.
[55] Pan Z. H., Liu X., Luo Y. M., & Chen X. H. (2017). Emotional intensity modulates the integration of bimodal angry expressions: ERP evidence. Frontiers in Neuroscience, 11.
[56] Park J. Y., Gu B. M., Kang D. H., Shin Y. W., Choi C. H., Lee J. M., & Kwon J. S. (2010). Integration of cross-modal emotional information in the human brain: An fMRI study. Cortex, 46(2), 161-169.
[57] Paulmann S., Jessen S., & Kotz S. A. (2009). Investigating the multimodal nature of human communication: Insights from ERPs. Journal of Psychophysiology, 23(2), 63-76.
[58] Paulmann S., & Pell M. D. (2010a). Contextual influences of emotional speech prosody on face processing: How much is enough? Cognitive, Affective, & Behavioral Neuroscience, 10(2), 230-242.
[59] Paulmann S., & Pell M. D. (2010b). Dynamic emotion processing in Parkinson's disease as a function of channel availability. Journal of Clinical and Experimental Neuropsychology, 32(8), 822-835.
[60] Pourtois G., de Gelder B., Vroomen J., Rossion B., & Crommelinck M. (2000). The time-course of intermodal binding between seeing and hearing affective information. NeuroReport, 11(6), 1329-1333.
[61] Pourtois G., Debatisse D., Despland P. A., & de Gelder B. (2002). Facial expressions modulate the time course of long latency auditory brain potentials. Cognitive Brain Research, 14(1), 99-105.
[62] Pourtois G., Thut G., de Peralta R. G., Michel C., & Vuilleumier P. (2005). Two electrophysiological stages of spatial orienting towards fearful faces: Early temporo-parietal activation preceding gain control in extrastriate visual cortex. NeuroImage, 26(1), 149-163.
[63] Proverbio A. M., & De Benedetto F. (2018). Auditory enhancement of visual memory encoding is driven by emotional content of the auditory material and mediated by superior frontal cortex. Biological Psychology, 132, 164-175.
[64] Robins D. L., Hunyadi E., & Schultz R. T. (2009). Superior temporal activation in response to dynamic audio-visual emotional cues. Brain and Cognition, 69(2), 269-278.
[65] Romero Y. R., Senkowski D., & Keil J. (2015). Early and late beta-band power reflect audiovisual perception in the McGurk illusion. Journal of Neurophysiology, 113(7), 2342-2350.
[66] Schelenz P. D., Klasen M., Reese B., Regenbogen C., Wolf D., Kato Y., & Mathiak K. (2013). Multisensory integration of dynamic emotional faces and voices: Method for simultaneous EEG-fMRI measurements. Frontiers in Human Neuroscience, 7, 729.
[67] Schupp H. T., Stockburger J., Codispoti M., Junghöfer M., Weike A. I., & Hamm A. O. (2007). Selective visual attention to emotion. Journal of Neuroscience, 27(5), 1082-1089.
[68] Simon D. M., & Wallace M. T. (2018). Integration and temporal processing of asynchronous audiovisual speech. Journal of Cognitive Neuroscience, 30(3), 319-337.
[69] Stein B. E., & Stanford T. R. (2008). Multisensory integration: Current issues from the perspective of the single neuron. Nature Reviews Neuroscience, 9(4), 255-266.
[70] Stein B. E., Stanford T. R., Ramachandran R., Perrault T. J., Jr., & Rowland B. A. (2009). Challenges in quantifying multisensory integration: Alternative criteria, models, and inverse effectiveness. Experimental Brain Research, 198(2-3), 113-126.
[71] Strelnikov K., Foxton J., Marx M., & Barone P. (2015). Brain prediction of auditory emphasis by facial expressions during audiovisual continuous speech. Brain Topography, 28(3), 494-505.
[72] Symons A. E., El-Deredy W., Schwartze M., & Kotz S. A. (2016). The functional role of neural oscillations in non-verbal emotional communication. Frontiers in Human Neuroscience, 10, 239.
[73] Tallon-Baudry C., & Bertrand O. (1999). Oscillatory gamma activity in humans and its role in object representation. Trends in Cognitive Sciences, 3(4), 151-162.
[74] Tang X. Y., Wu J. L., & Shen Y. (2016). The interactions of multisensory integration with endogenous and exogenous attention. Neuroscience and Biobehavioral Reviews, 61, 208-224.
[75] Van Kleef G. A. (2009). How emotions regulate social life: The emotions as social information (EASI) model. Current Directions in Psychological Science, 18(3), 184-188.
[76] van Wassenhove V., Grant K. W., & Poeppel D. (2005). Visual speech speeds up the neural processing of auditory speech. Proceedings of the National Academy of Sciences of the United States of America, 102(4), 1181-1186.
[77] Yang C. Y., & Lin C. P. (2017). Magnetoencephalography study of different relationships among low- and high-frequency-band neural activities during the induction of peaceful and fearful audiovisual modalities among males and females. Journal of Neuroscience Research, 95(1-2), 176-188.
[78] Yaple Z. A., Vakhrushev R., & Jolij J. (2016). Investigating emotional top down modulation of ambiguous faces by single pulse TMS on early visual cortices. Frontiers in Neuroscience, 10, 305.
[79] Yeh P. W., Geangu E., & Reid V. (2016). Coherent emotional perception from body expressions and the voice. Neuropsychologia, 91, 99-108.
[80] Zhu L. L., & Beauchamp M. S. (2017). Mouth and voice: A relationship between visual and auditory preference in the human superior temporal sulcus. Journal of Neuroscience, 37(10), 2697-2708.
[81] Zinchenko A., Obermeier C., Kanske P., Schröger E., & Kotz S. A. (2017). Positive emotion impedes emotional but not cognitive conflict processing. Cognitive, Affective, & Behavioral Neuroscience, 17(3), 665-677.