%A SHI Wendian;LI Xiujun;WANG Wei;YAN Wenhua %T Comparison of Implicit Learning Effect between Multisensory and Unisensory %0 Journal Article %D 2013 %J Acta Psychologica Sinica %R 10.3724/SP.J.1041.2013.01313 %P 1313-1323 %V 45 %N 12 %U {https://journal.psych.ac.cn/xlxb/CN/abstract/article_2074.shtml} %8 2013-12-25 %X

In recent years there has been a controversy in cognitive psychology between two models of knowledge acquisition: modality-dependent versus abstract representations. Some studies showed that participants acquired knowledge based on the underlying regularities (Barbey & Wilson, 2003), and that implicit learning transferred not only across letter sets but also across sensory modalities (Tunney & Altmann, 2001; Kirkham, Slemmer, & Johnson, 2002). Such transfer effects are explained by proposing that learning is based on abstract knowledge, that is, knowledge not directly tied to the surface features or sensory instantiation of the stimuli (Peña, Bonatti, Nespor, & Mehler, 2002). In contrast, other studies, grounded in modality-specific sensorimotor mechanisms, demonstrated that implicit learning is sensitive not only to stimulus-specific features (e.g., Chang & Knowlton, 2004) but also to modality-specific features (e.g., Barsalou, Simmons, Barbey, & Wilson, 2003; Conway & Christiansen, 2005, 2006, 2009; Emberson, 2011). Whether implicit learning relies on a single central mechanism or on multiple modality-specific mechanisms therefore requires further exploration. Previous research has mainly compared learning within single modalities, but the sensory environment is seldom limited to a single modality or input source (Stein & Stanford, 2008); implicit learning may thus operate in the auditory and visual modalities simultaneously. The objective of the current research was to explore to what extent multimodal input sources are processed independently. A total of 169 college students took part in three experiments, all using an artificial grammar learning task. In Experiment 1, visual and auditory implicit learning effects were measured separately, providing baseline learning rates for comparison in the subsequent experiments.
In Experiment 2, audiovisual sequences governed by the same grammar were presented simultaneously. In Experiment 3, audiovisual sequences governed by different grammars were presented simultaneously. Results showed that: (1) There were significant implicit learning effects in both the visual and auditory modalities. (2) When audiovisual sequences governed by the same grammar were presented simultaneously, the implicit learning effects for vision and audition were marginally significant, and there was no significant difference between unisensory and multisensory learning. (3) When audiovisual sequences governed by different grammars were presented simultaneously, there were significant implicit learning effects in both the visual and auditory modalities, again with no significant difference between unisensory and multisensory learning. One conclusion of the current research is that multisensory implicit learning is about as effective as unisensory implicit learning. Participants were able to track two sets of sequential regularities simultaneously regardless of the similarity of the grammars, indicating that learners possess a multisensory implicit learning ability. Multistream statistical learning proceeds independently in each modality, which may indicate the involvement of multiple learning subsystems. These results support the modality-specific account of implicit learning and challenge the abstract-representation account.