Advances in Psychological Science    2019, Vol. 27 Issue (7) : 1205-1214     DOI: 10.3724/SP.J.1042.2019.01205
Regular Articles
The integration of facial expression and vocal emotion and its brain mechanism
LI Ping, ZHANG Mingming, LI Shuaixia, ZHANG Huoyin, LUO Wenbo
Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian 116029, China
Abstract  

The integration of emotional information from different modalities (e.g., face and voice) plays an important role in interpersonal communication. Research on its brain mechanism has converged on the view that the interaction between facial expression and vocal emotional information begins at an early perceptual stage, whereas the integration of emotional content occurs at a later, decision-related stage. In the early stage, the primary sensory cortices encode the incoming information; in the late stage, the amygdala, temporal lobe, and other higher-order brain regions carry out cognitive evaluation. In addition, functional coupling of oscillatory activity across multiple frequency bands facilitates the integration of emotional information across modalities. Future research should examine whether the integration of facial and vocal emotional information is associated with emotional conflict, and whether inconsistent emotional information confers a processing advantage. Finally, researchers should clarify how neural oscillations in different frequency bands promote the integration of facial and vocal emotional information, so as to further understand its dynamic basis.
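The cross-frequency functional coupling mentioned above is commonly quantified with measures such as the phase-locking value (PLV) between band-limited signals. As a minimal illustrative sketch (not the pipeline of any study reviewed here), the snippet below simulates two "channels" that share a 6 Hz theta component and computes their theta-band PLV via the Hilbert transform; all signal parameters (sampling rate, band edges, noise level) are arbitrary choices for the demonstration.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def bandpass(x, fs, lo, hi, order=4):
    """Zero-phase band-pass filter (Butterworth, second-order sections)."""
    sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

def phase_locking_value(x, y, fs, lo, hi):
    """PLV between two signals within [lo, hi] Hz: |mean(exp(i*dphi))|."""
    phase_x = np.angle(hilbert(bandpass(x, fs, lo, hi)))
    phase_y = np.angle(hilbert(bandpass(y, fs, lo, hi)))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

fs = 500                         # sampling rate in Hz (arbitrary)
t = np.arange(0, 2, 1 / fs)      # 2 s of data
rng = np.random.default_rng(0)

# Two channels sharing a 6 Hz (theta-band) oscillation plus independent noise.
shared = np.sin(2 * np.pi * 6 * t)
x = shared + 0.3 * rng.standard_normal(t.size)
y = shared + 0.3 * rng.standard_normal(t.size)
plv_theta = phase_locking_value(x, y, fs, 4, 8)

# Control pair: two independent noise channels, no shared oscillation.
plv_noise = phase_locking_value(rng.standard_normal(t.size),
                                rng.standard_normal(t.size), fs, 4, 8)
```

With a shared theta rhythm, `plv_theta` approaches 1, while the independent-noise control stays well below it; analogous contrasts (e.g., congruent vs. incongruent face-voice pairs) are how oscillatory coupling effects are typically assessed.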

Keywords: emotion; integration effect; ERP; neural oscillation; fMRI
Chinese Library Classification (ZTFLH): B842
Corresponding Author: Wenbo LUO, E-mail: luowb@lnnu.edu.cn
Issue Date: 22 May 2019
Cite this article:   
Ping LI, Mingming ZHANG, Shuaixia LI, et al. The integration of facial expression and vocal emotion and its brain mechanism[J]. Advances in Psychological Science, 2019, 27(7): 1205-1214.
URL:  
http://journal.psych.ac.cn/xlkxjz/EN/10.3724/SP.J.1042.2019.01205     OR     http://journal.psych.ac.cn/xlkxjz/EN/Y2019/V27/I7/1205