ISSN 1671-3710
CN 11-4766/R
Sponsored by: Institute of Psychology, Chinese Academy of Sciences
Published by: Science Press

Advances in Psychological Science ›› 2023, Vol. 31 ›› Issue (suppl.): 129-129.

• Visual Learning and Plasticity •

Representation- and Task-based Plasticity Decides Perceptual Learning and its Specificity and Transfer: A Computational Model

Xiao Liu^a, Muyang Lyu^b, Cong Yu^a, Si Wu^a

  a. School of Psychology and Cognitive Sciences, Peking University;
  b. School of Artificial Intelligence, Beijing Normal University
  • Online: 2023-08-26   Published: 2023-09-08

Abstract: PURPOSE: Perceptual learning improves sensory discrimination with practice, reflecting plasticity in the brain. Specificity has long been regarded as a hallmark of perceptual learning, in that learned improvement is not maintained when the task condition (e.g., location or orientation) changes; however, new training paradigms such as double training can render originally specific perceptual learning completely transferable. In this study, we aimed to build a unified neural computational model that explains learning specificity and transfer, in order to better understand the neural mechanisms underlying perceptual learning.
METHODS: We propose a new computational framework built on the following simple assumptions. First, there are two types of plasticity: task-based plasticity (general learning for decision making) and representation-based plasticity (specific learning for extracting stimulus features). Second, perceptual learning is transferable by default (task-based plasticity dominates), but conventional training procedures induce overlearning (feature-based plasticity dominates) that makes learning specific. Third, double training removes this constraint by providing further exposure to new stimulus features under the transfer stimulus conditions, so that learning can transfer to these features.
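These assumptions can be illustrated with a minimal computational sketch (not the authors' implementation; the function names, time constants, and the two-condition setup below are illustrative assumptions): a condition-invariant task-based gain that saturates quickly is shared across stimulus conditions, while a condition-specific representation-based gain accumulates slowly and only for conditions whose features are repeatedly presented.

    import math

    def train(n_trials, exposed_conds=("A",), tau_task=50.0, tau_rep=300.0):
        """Return per-condition sensitivity gains after n_trials of training at condition A."""
        # Task-based gain: shared across stimulus conditions, saturates quickly.
        task_gain = 1.0 - math.exp(-n_trials / tau_task)
        # Representation-based gain: builds up slowly, and only for conditions
        # whose stimulus features are repeatedly presented during training.
        rep_gain = {c: (1.0 - math.exp(-n_trials / tau_rep)) if c in exposed_conds else 0.0
                    for c in ("A", "B")}
        # Total improvement at each condition (arbitrary units).
        return {c: task_gain + rep_gain[c] for c in ("A", "B")}

    def transfer_index(gains, trained="A", untrained="B"):
        # Fraction of the trained-condition improvement that appears at the untrained condition.
        return gains[untrained] / gains[trained]

    print(transfer_index(train(50)))                              # ~0.8: brief training mostly transfers
    print(transfer_index(train(2000)))                            # ~0.5: extended training becomes specific
    print(transfer_index(train(2000, exposed_conds=("A", "B")) )) # ~1.0: double training restores transfer

In this toy version, the ratio of the two time constants (tau_task vs. tau_rep) alone determines when the balance tips from transferable, task-dominated learning to specific, representation-dominated learning, and additional exposure at the transfer condition is what restores transfer.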
RESULTS: Our model successfully replicates several perceptual learning outcomes in a Vernier learning task. With a small number of stimulus repetitions, task-based plasticity dominates perceptual learning and learning shows transferability. As training progresses with more repetitive trials, feature-based plasticity gradually increases, focusing primarily on repeatedly occurring features and ignoring others, which results in specificity (Jeter et al., 2009). Double training introduces new repeatedly presented features (e.g., location or orientation), activating feature-based plasticity under the transfer condition and achieving complete transfer of learning to the new stimulus location or orientation (Xiao et al., 2008; Zhang et al., 2010). Analysis of the network's neuronal activities reveals that the task plasticity module extracts stimulus-condition-invariant information, whereas the feature plasticity module enhances feature processing by improving the signal-to-noise ratio. The balance between task-based and feature-based learning is crucial for successful learning and for its specificity and transfer.
CONCLUSIONS: This model offers a new perspective on interpreting perceptual learning, especially double training and the resulting learning transfer. Further work is needed to explain learning transfer among physically distinct stimuli, which suggests that perceptual learning may also operate at a conceptual level.

Key words: perceptual learning, specificity, transfer, plasticity