Advances in Psychological Science ›› 2019, Vol. 27 ›› Issue (7): 1205-1214. doi: 10.3724/SP.J.1042.2019.01205
• Regular Articles •
The integration of facial expression and vocal emotion and its brain mechanism
LI Ping, ZHANG Mingming, LI Shuaixia, ZHANG Huoyin, LUO Wenbo
Received: 2018-05-17
Online: 2019-07-15
Published: 2019-05-22
Contact: LUO Wenbo (E-mail: luowb@lnnu.edu.cn)
LI Ping, ZHANG Mingming, LI Shuaixia, ZHANG Huoyin, LUO Wenbo. The integration of facial expression and vocal emotion and its brain mechanism[J]. Advances in Psychological Science, 2019, 27(7): 1205-1214.