ISSN 1671-3710
CN 11-4766/R
Sponsored by: Institute of Psychology, Chinese Academy of Sciences
Published by: Science Press

Advances in Psychological Science ›› 2024, Vol. 32 ›› Issue (5): 790-799. doi: 10.3724/SP.J.1042.2024.00790

• Regular Articles •

The cross-modal integration process in facial attractiveness judgments

WANG Yuling, LU Xiaowei, WU Zongjie, LI Guogen, ZHANG Lin

  1. Department and Institute of Psychology, Ningbo University, Ningbo 315211, China
  • Received: 2023-09-07  Online: 2024-05-15  Published: 2024-03-05
  • Contact: ZHANG Lin  E-mail: zhanglin1@nbu.edu.cn

Abstract:

Facial attractiveness research has traditionally centered on visual cues, sidelining the contribution of non-visual information. Through a retrospective analysis, this review shows that when evaluating facial attractiveness, individuals rely not only on visual information but also on auditory and olfactory cues, and that these diverse sensory inputs jointly shape the judgment of facial attractiveness. However, because auditory and olfactory stimuli differ in both their content and the way they are conveyed, the factors that influence their integration with visual facial cues may also differ. Compared with audio-visual integration, visual-olfactory integration appears more susceptible to the influence of familiarity.

Through a retrospective review, this study finds that cross-modal integration in facial attractiveness judgments largely follows the general cross-modal integration process and adheres to similar integration mechanisms. Sensory information and prior experience play crucial roles in this process. Sensory inputs act as bottom-up stimuli that capture attention and bind the diverse sensory information onto the target face, whereas the standard face, formed from perceptual experience, serves as top-down prior knowledge that strengthens the connections between different sensory inputs and promotes their integration. From a more fine-grained perspective, however, cross-modal integration in facial attractiveness judgments also exhibits unique characteristics. Because facial attractiveness judgments are inherently social and individual, factors such as emotion, sensory thresholds, and familiarity exert substantial influences on this integration process, whereas their impact on general cross-modal integration is limited.

In addition, this review integrates the Face-space Model and the Bayesian Causal Inference model to propose a cross-modal integration process for facial attractiveness judgments. Facial attractiveness judgment is based on the deviation between the target face and a standard face, and the formation of the standard face involves not only visual information but also other sensory modalities. When exposed to multiple sensory inputs, individuals naturally link the different pieces of information with reference to this standard face, achieving cross-modal integration. When individuals infer that different sensory cues originate from the same target face, they integrate these cues in the brain, forming a unified percept of the target face on which the attractiveness judgment is based.
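To make the final inference step concrete, a minimal sketch of the standard Bayesian Causal Inference computation is given below. The visual and olfactory cues $x_v$ and $x_o$, the common-source prior $p(C=1)$, and the variances $\sigma_v^2$, $\sigma_o^2$, $\sigma_p^2$ are illustrative placeholders, not quantities specified in this review. The observer first infers whether the two cues share a common source:

$$p(C=1 \mid x_v, x_o) = \frac{p(x_v, x_o \mid C=1)\,p(C=1)}{p(x_v, x_o \mid C=1)\,p(C=1) + p(x_v, x_o \mid C=2)\,\bigl(1 - p(C=1)\bigr)}.$$

If a common source is inferred, the cues are fused with weights proportional to their reliabilities,

$$\hat{s}_{C=1} = \frac{x_v/\sigma_v^2 + x_o/\sigma_o^2 + \mu_p/\sigma_p^2}{1/\sigma_v^2 + 1/\sigma_o^2 + 1/\sigma_p^2},$$

whereas under separate sources each cue is estimated on its own, and the final percept weights the two outcomes by $p(C=1 \mid x_v, x_o)$. In this sketch, $\mu_p$ and $\sigma_p^2$ denote the prior mean and variance, which in the framework proposed here would correspond to the standard face.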

Based on existing research, this review suggests three directions for future work. First, current studies typically focus on pairwise integration among visual, olfactory, and auditory cues and neglect the role of tactile cues. Future research should examine how facial stimuli are integrated in richer multisensory environments, using deep learning and machine learning techniques to analyze large-scale multisensory data and thereby build a more comprehensive model of the cross-modal integration process. Second, with respect to the cross-modal integration process in facial attractiveness judgments, current research has not provided reliable evidence of unconscious integration. Stimuli presented outside awareness do not necessarily reflect real-life situations, and the extent to which the visual system can guide and allocate attention on the basis of unconsciously perceived cues and subsequent targets remains unresolved. Future research should therefore focus on how strongly the integration process depends on conscious perception, rather than on whether it can occur under unconscious conditions. Third, given the complexity and dynamism of cross-modal integration in facial attractiveness judgments, future research could use EEG techniques to examine the different stages of the integration mechanism during social interaction. In addition, more ecologically valid materials and environments could deepen our understanding of cross-modal integration in real-world social interaction, supporting a more accurate and rapid grasp of the external world and promoting social engagement.

In conclusion, this comprehensive review synthesizes existing knowledge and outlines promising avenues for future research in the realm of facial attractiveness and multisensory integration. By unraveling the intricate interplay of sensory modalities, this research aims to provide a deeper understanding of how individuals perceive and evaluate facial attractiveness, paving the way for advancements in facial attractiveness and cross-modal integration studies.

Key words: facial attractiveness, cross-modal integration, olfaction, audition, vision
