ISSN 1671-3710
CN 11-4766/R
Sponsored by: Institute of Psychology, Chinese Academy of Sciences
Published by: Science Press

Advances in Psychological Science ›› 2023, Vol. 31 ›› Issue (2): 173-195. doi: 10.3724/SP.J.1042.2023.00173

• Research Method •

Exploring the neural representation patterns in event-related EEG/MEG signals: The methods based on classification decoding and representation similarity analysis

CHEN Xinwen, LI Hongjie, DING Yulong

  1. Key Laboratory of Brain, Cognition and Education Sciences (South China Normal University), Ministry of Education; School of Psychology, South China Normal University; Center for Studies of Psychological Application, South China Normal University; Guangdong Key Laboratory of Mental Health and Cognitive Science, Guangzhou 510631, China
  • Received: 2021-12-16  Online: 2023-02-15  Published: 2022-11-10
  • Contact: DING Yulong, E-mail: dingyulong@m.scnu.edu.cn

Abstract:

It is generally considered that the human brain generates distinct neural representations corresponding to different mental processes. Exploring the differences in neural representations across mental activities is one of the core issues in cognitive neuroscience. In recent decades, researchers have used different neuroimaging techniques to record brain activities involved in complex cognitive processes from the perspective of temporal or spatial measurement. Among these techniques, non-invasive EEG/MEG, with its millisecond temporal resolution, has become a popular tool for studying the time courses of various cognitive activities. Because of the characteristics of EEG/MEG data (e.g., low signal-to-noise ratio), traditional EEG/MEG studies have mainly focused on neural responses after group averaging in order to obtain relatively reliable results, paying less attention to individual differences. This approach assumes that, for each subject, the amplitudes and directions of ERPs/ERMFs and their topographic maps in a specific time window of interest exhibit a consistent pattern under a given experimental condition. When this consistency is poor, the neural responses of different subjects may largely cancel each other out after group averaging, making a reasonable interpretation difficult.
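The cancellation problem can be made concrete with a small simulation. In this hypothetical scenario (all data below are simulated, not drawn from any study), each subject shows a strong component at a fixed latency, but its polarity differs across subjects, e.g., because of differing dipole orientations; the grand average is then far weaker than any individual response:

```python
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_times = 20, 100
times = np.linspace(0.0, 0.5, n_times)  # seconds

# Each subject shows a clear component at ~200 ms, but with a
# subject-specific polarity (hypothetical parameters for illustration).
component = np.exp(-((times - 0.2) ** 2) / (2 * 0.02 ** 2))
polarities = rng.choice([-1.0, 1.0], size=n_subjects)
erps = (polarities[:, None] * component
        + 0.1 * rng.standard_normal((n_subjects, n_times)))

grand_average = erps.mean(axis=0)

# Individual responses are strong at the component's peak latency...
peak_single = np.abs(erps[:, np.argmax(component)]).mean()
# ...but opposite polarities largely cancel in the grand average.
peak_grand = np.abs(grand_average).max()
print(peak_single, peak_grand)
```

Averaging within each subject (or analyzing subjects individually, as the methods below do) preserves the effect that the grand average hides.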

In recent years, researchers have introduced two techniques commonly used in fMRI studies, classification algorithms from machine learning (i.e., classification-based decoding) and representation similarity analysis, into EEG/MEG data analysis. By taking individual differences into account, these two techniques overcome the shortcomings of traditional EEG/MEG analyses based on averaging voltage/magnetic flux density waveforms: they can reveal the coding of neural representations at the individual level and provide a new way to explore how the brain dynamically encodes specific neural representations. In the study of ERPs/ERMFs, classification-based decoding and representation similarity analysis can be used to explore not only neural mechanisms that show consistent temporal patterns across individuals, but also those that differ markedly across individuals yet remain stable within a given individual. These two techniques can therefore reveal specific neural representation patterns and even identify "brain fingerprints" at the individual level. Grounded in different methodological principles, they offer novel ways for EEG/MEG studies to compare representational differences in cognitive processes across time windows, tasks, modalities, and groups. We first introduce the principles and operational procedures of classification-based decoding and representation similarity analysis, comparing them with traditional EEG/MEG analysis methods. We then review the EEG/MEG studies to date that use these two techniques, and finally propose possible future research directions.
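As a concrete illustration of the two techniques, the following minimal sketch applies time-resolved decoding and a simple RDM comparison to simulated EEG-like data. All conditions, patterns, and parameters here are hypothetical assumptions for illustration; real studies would run such analyses on recorded single-trial data (e.g., with MNE-Python) per subject:

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_conditions, n_trials_per, n_channels, n_times = 4, 20, 32, 50
onset = 25  # hypothetical index where stimulus-driven activity begins
labels = np.repeat(np.arange(n_conditions), n_trials_per)

# Condition-specific spatial patterns: conditions 0/1 and 2/3 form
# similar pairs, a structure the model RDM below should recover.
base_a, base_b = rng.standard_normal((2, n_channels))
patterns = np.stack([
    base_a,
    base_a + 0.3 * rng.standard_normal(n_channels),
    base_b,
    base_b + 0.3 * rng.standard_normal(n_channels),
])

# Simulated single-trial data: noise everywhere, pattern after "onset".
X = rng.standard_normal((len(labels), n_channels, n_times))
X[:, :, onset:] += patterns[labels][:, :, None]

# --- Time-resolved classification-based decoding ------------------------
# Cross-validate a classifier on the channel pattern at each time point;
# above-chance accuracy indicates condition information at that latency.
accuracy = np.array([
    cross_val_score(LinearDiscriminantAnalysis(), X[:, :, t], labels,
                    cv=5).mean()
    for t in range(n_times)
])

# --- Representation similarity analysis ---------------------------------
# Neural RDM from trial-averaged post-onset patterns, compared with a
# model RDM (similar pairs vs. dissimilar pairs) by rank correlation.
cond_means = np.stack([X[labels == c, :, onset:].mean(axis=(0, 2))
                       for c in range(n_conditions)])
neural_rdm = pdist(cond_means, metric="correlation")
model_rdm = squareform(np.array([[0, 0, 1, 1],
                                 [0, 0, 1, 1],
                                 [1, 1, 0, 0],
                                 [1, 1, 0, 0]], dtype=float))
rho, _ = spearmanr(neural_rdm, model_rdm)
```

Because both the decoding time course and the neural-to-model RDM correlation are computed within a single dataset, the same pipeline can be run per subject, making individual-level comparisons across time windows, tasks, or groups straightforward.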

Key words: electroencephalography/magnetoencephalography, neural representation, machine learning/classification-based decoding, representation similarity analysis
