Acta Psychologica Sinica    2018, Vol. 50 Issue (5) : 483-493     DOI: 10.3724/SP.J.1041.2018.00483
 Visual and auditory verbal working memory affects visual attention in the semantic matching
 LI Biqin1; LI Ling1; WANG Aijun2; ZHANG Ming2
 (1 Lab of Psychology and Cognition Science of Jiangxi, School of Psychology, Jiangxi Normal University, Nanchang 330022, China) (2 Department of Psychology, Research Center for Psychological and Behavioral Sciences, Soochow University, Suzhou 215123, China)
Abstract   Previous studies have shown that information held in working memory (WM) can guide or capture attention during visual search in a relatively automatic way, even when it is irrelevant and detrimental to current task performance. Some researchers have proposed that a semantic match between WM contents and distractors can capture attention, just as a perceptual match does. As is well known, verbal WM contents can be encoded from both visual and auditory inputs. Although the automatic influence of visual verbal WM on visual attention has been demonstrated, it remains unknown whether auditory verbal WM can automatically capture attention. Therefore, it is necessary to investigate attentional guidance by verbal WM contents. The present study comprised two experiments to explore these questions. In Experiment 1, the memory item was a visually presented Chinese character denoting a color, such as "红" (red). Participants were instructed to remember the character and to ignore potential distractors. Subsequently, they completed a visual search task designed to test whether the verbal WM contents could guide attention. The results showed that, compared with the control condition, visual search RTs were longer in the perceptual-matching and semantic-matching conditions, and the same pattern held for the fastest trials. Because the memory item never matched the target in the search task, we suggest that visually presented verbal WM contents (vis-VWM) can automatically capture attention at both the perceptual and the semantic level. In Experiment 2, the memory item was presented aurally via headphones (audi-VWM). The results showed that visual search RTs in the semantic-matching condition were shorter than those in the control and perceptual-matching conditions, with no significant differences among the other conditions.
Meanwhile, when the shortest RTs were compared across conditions, RTs in the semantic-matching condition were longer than in the control condition, suggesting that aurally presented verbal WM can capture attention at the semantic level in the fastest-response trials. In conclusion, the present study demonstrated that visually presented verbal working memory can automatically capture attention at both the perceptual and the semantic level, and it verified the hypothesis that the attention capture effect occurs at an early stage of attention. When the contents were presented aurally, however, they captured attention at the earlier processing stage but could be rejected at the later processing stage. Owing to modality specificity, attentional resources are distributed across different sensory modalities; memory-matching distractors could therefore be rejected at the later processing stage because sufficient cognitive resources were available.
Keywords: attention capture; verbal working memory; visual search; semantic matching
Corresponding Authors: WANG Aijun; LI Biqin
Issue Date: 31 March 2018
Cite this article:   
LI Biqin,LI Ling,WANG Aijun, et al.  Visual and auditory verbal working memory affects visual attention in the semantic matching[J]. Acta Psychologica Sinica, 2018, 50(5): 483-493.
Copyright © Acta Psychologica Sinica