ISSN 1671-3710
CN 11-4766/R
Sponsored by: Institute of Psychology, Chinese Academy of Sciences
Published by: Science Press

Advances in Psychological Science ›› 2025, Vol. 33 ›› Issue (11): 1837-1853. doi: 10.3724/SP.J.1042.2025.1837

• Conceptual Framework •

Micro-expression analysis for practical applications: From data acquisition to intelligent deployment

LI Jingting, ZHAO Lin, DONG Zizhao, WANG Su-Jing

  1. State Key Laboratory of Cognitive Science and Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing 100101, China
  2. Department of Psychology, University of Chinese Academy of Sciences, Beijing 100049, China
  • Received: 2025-03-06 Online: 2025-11-15 Published: 2025-09-19
  • Contact: WANG Su-Jing E-mail: wangsujing@psych.ac.cn

Abstract:

Micro-expressions, as facial cues that leak an individual's true emotions during attempts at concealment, hold significant promise for applications across diverse fields such as medical care, public safety, and national security. However, the practical deployment of intelligent micro-expression analysis is currently hampered by several critical issues. These include the scarcity of large-scale datasets, the suboptimal performance of analytical models on complex real-world samples, and the inherent data privacy and transmission limitations found in many application scenarios. These challenges collectively constrain the real-world implementation of this technology.

To advance intelligent micro-expression analysis as a non-contact, imperceptible method for emotion monitoring in specific contexts, this research will undertake theoretical and technical investigations in the following areas. Facing the dual challenges of scarce high-ecological-validity micro-expression data and an insufficient understanding of the behavioral and physiological mechanisms of micro-expressions in specific application contexts, this study first builds on psychological research to investigate micro-expression mechanisms in interactive settings. We design experimental paradigms that effectively elicit micro-expressions under varying levels of ecological validity, including a "video-induced paradigm," an "intentional deception paradigm in an interactive context," an "active lying paradigm," and a "mock crime paradigm." Using high-definition cameras, depth cameras, thermal imagers, and polygraphs, we will comprehensively record participants' facial expressions, body posture, and physiological changes, with the aim of constructing a multi-modal, multi-view, multi-scenario micro-expression database with high ecological validity. To overcome the time- and labor-intensive nature of manual coding, we will then develop an auxiliary coding system based on perifacial electromyography to efficiently build this large-scale database. Finally, based on the collected data, we will analyze the behavioral and physiological patterns of micro-expressions across different interactive situations.
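
Recording with several devices at once (cameras, thermal imagers, polygraphs) raises a routine engineering question: how to place all streams on a common timeline. A minimal sketch of nearest-timestamp alignment — an illustrative detail, not a method specified in this work — might look like:

```python
from bisect import bisect_left

def align_nearest(ref_ts, other_ts):
    """For each reference timestamp, return the index of the nearest
    timestamp in the other (sorted) stream."""
    idx = []
    for t in ref_ts:
        i = bisect_left(other_ts, t)
        # Candidates are other_ts[i-1] and other_ts[i]; pick the closer one.
        if i == 0:
            idx.append(0)
        elif i == len(other_ts):
            idx.append(len(other_ts) - 1)
        else:
            idx.append(i if other_ts[i] - t < t - other_ts[i - 1] else i - 1)
    return idx

# Example: a 30 fps camera aligned against a 10 Hz physiological stream.
camera_ts = [0.0, 1 / 30, 2 / 30, 3 / 30]
physio_ts = [0.0, 0.1, 0.2]
print(align_nearest(camera_ts, physio_ts))  # -> [0, 0, 1, 1]
```

In practice the streams would first be synchronized to a shared clock (e.g. via a hardware trigger); this sketch only handles the per-frame matching step.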

In video data captured in realistic, complex environments, facial movements caused by head-pose variation and speech-related mouth articulation are common confounds in micro-expression spotting. Because micro-expressions are brief and subtle, they are easily obscured by these more pronounced movements, which distort their temporal and spatial features and reduce the accuracy and recall of spotting algorithms. This research will develop efficient, plug-and-play processing algorithms that distinguish and filter out facial muscle actions caused by head movement and speech, ensuring precise extraction of micro-expression features. To address the prevalent small-sample problem in micro-expression analysis, this study then proposes a self-supervised learning model tailored to the micro-expression vertical domain, built on large vision models. We will construct a two-stage downstream task structure within this domain: first, the model learns facial motion patterns by estimating the intensities of facial action units from a large volume of unlabeled expression data; it is then fine-tuned on the micro-expression recognition task using a small amount of labeled data. By harnessing the powerful image feature extraction capabilities of large models, this approach progressively learns facial motion patterns and micro-expression features, overcoming small-sample limitations and enhancing recognition performance.
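
The two-stage idea — pretrain a representation on a proxy task (action-unit intensity regression) over plentiful unlabeled expression data, then fine-tune a small head on scarce labeled micro-expression samples — can be sketched in miniature. Everything here is synthetic and hypothetical: linear models stand in for the large vision model, and random data stands in for expression frames.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stage 1: learn a feature projection W by regressing AU intensities
# from many "unlabeled" expression frames (AU intensity is the proxy task).
X_pre = rng.normal(size=(1000, 32))             # frame features
au = X_pre @ rng.normal(size=(32, 4))           # synthetic AU intensities
W, *_ = np.linalg.lstsq(X_pre, au, rcond=None)  # learned projection: 32 -> 4

# Stage 2: fine-tune a tiny classifier head on few labeled samples,
# reusing the frozen projection W as the pretrained feature extractor.
X_few = rng.normal(size=(40, 32))
y_few = (X_few @ W[:, 0] > 0).astype(int)       # synthetic binary labels
feats = X_few @ W                               # frozen pretrained features
head, *_ = np.linalg.lstsq(feats, 2.0 * y_few - 1.0, rcond=None)

pred = (feats @ head > 0).astype(int)
accuracy = (pred == y_few).mean()
```

The point of the sketch is the division of labor: the expensive representation is learned once from unlabeled data, while the labeled micro-expression set only has to fit a small head, which is what makes the small-sample regime tractable.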

Given that applications of intelligent micro-expression analysis often involve data-security concerns, we will employ an asynchronous federated learning deployment strategy. This approach trains models on data from diverse sources without directly sharing the raw data: federated learning preserves data security and privacy while allowing the model to learn from multi-source data, improving its generalization and accuracy. At the client level, the model's adaptability to small samples and specific scenarios will be further enhanced through a combination of self-supervised and reinforcement learning. This architecture supports effective model training and updating while adapting to dynamic application environments, advancing micro-expression analysis technology and providing a solid security foundation for its deployment in sensitive fields.
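
The server side of asynchronous federated learning can be summarized as: whenever a client update arrives, blend it into the global model with a weight that decays with the update's staleness, so only model parameters — never raw data — cross the network. A toy sketch follows; the particular staleness schedule is our assumption for illustration, not a design stated in this work.

```python
def staleness_weight(staleness, base=0.5):
    """Down-weight updates computed against an old global model."""
    return base / (1.0 + staleness)

def server_apply(global_model, client_model, client_round, server_round):
    """Blend one client's parameters into the global model
    (one asynchronous aggregation step). Models are flat lists of floats."""
    alpha = staleness_weight(server_round - client_round)
    return [(1 - alpha) * g + alpha * c
            for g, c in zip(global_model, client_model)]

g = [0.0, 0.0]
# A fresh update (staleness 0) moves the model with weight alpha = 0.5.
g = server_apply(g, [1.0, 2.0], client_round=3, server_round=3)
# A stale update (staleness 4) contributes only alpha = 0.1.
g = server_apply(g, [1.0, 2.0], client_round=3, server_round=7)
```

Unlike synchronous federated averaging, the server never waits for all clients, which suits clients with intermittent connectivity — at the cost of having to discount stale contributions as above.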

Through the interdisciplinary fusion of psychology, computer science, and other fields, this research aims to establish an application framework for intelligent micro-expression analysis built upon a solid theoretical foundation. Research addressing these needs will not only enhance the performance of micro-expression analysis technology but will also foster the development of non-contact, imperceptible mental state monitoring applications in real-world settings.

Key words: micro-expression intelligent analysis, affective computing, high ecological validity, practical application

CLC Number: