ISSN 0439-755X
CN 11-1911/B

Acta Psychologica Sinica ›› 2016, Vol. 48 ›› Issue (2): 153-162.

### Interaction between spatial domain and spatial reference frame in deaf and hearing populations

WANG Aijun1,2; SHEN Lu3; CHI Yingying2; LIU Xiaole1; CHEN Qi3; ZHANG Ming1

1 Department of Psychology, Soochow University, Suzhou 215123, China; 2 School of Psychology, Northeast Normal University, Changchun 130024, China; 3 School of Psychology, South China Normal University, Guangzhou 510631, China
• Received: 2015-02-02 Published: 2016-02-25 Online: 2016-02-25
• Contact: ZHANG Ming, E-mail: psyzm@suda.edu.cn

Abstract:

The spatial location of an object can be represented in the human brain based on allocentric and egocentric reference frames. The perception/action hypothesis of the ventral and dorsal visual streams proposes that egocentric representations are readily encoded in the dorsal stream as sensorimotor representations, whereas allocentric representations are encoded in the ventral stream as perceptual representations. In addition, the dorsal visual stream, which transforms visual information into sensorimotor representations, is implicated in near-space processing, and the ventral stream, which transforms visual information into perceptual representations, is involved in the conscious perception of objects in far space. It has been well documented how spatial domain and spatial reference frame operate separately. However, it remains poorly understood how spatial domain interacts with spatial reference frame, especially in deaf populations. In the present study, we simultaneously manipulated spatial domain (near vs. far) and spatial reference frame (egocentric vs. allocentric) to investigate their potential interaction in deaf and hearing populations.
In Experiment 1, 17 congenitally deaf participants and 17 hearing participants performed allocentric and egocentric judgment tasks on the same stimulus set in near and far spaces, respectively, forming a 2 × 2 factorial design. The stimuli in each trial contained two 3-D objects: a fork on top of a round orange plate. In near space, stimuli were presented on a screen at an eye-to-screen distance of 50 cm; in far space, stimuli were presented via a projector on a screen at an eye-to-screen distance of 226 cm. The retinotopic sizes of the objects and the visual angles of the egocentric and allocentric distances were both matched between near and far spaces. To test whether the sense of balance was impaired in the deaf group of the present experiment, we asked both groups to perform an open-loop pointing test in Experiment 2.
The main results showed different interaction patterns between deaf and hearing participants. The interaction between spatial domain and spatial reference frame was significant in deaf participants: RTs to targets in the allocentric judgment task were significantly longer than RTs to targets in the egocentric judgment task both when targets appeared in far space, t(16) = 2.13, p < 0.05, d = 0.75, and when they appeared in near space, t(16) = 3.93, p = 0.001, d = 1.38. The interaction between spatial domain and spatial reference frame was also significant in hearing participants: RTs to targets in the allocentric judgment task were significantly shorter than RTs to targets in the egocentric judgment task when targets appeared in far space, t(16) = 3.64, p < 0.005, d = 0.64, whereas there was no significant difference between the two judgment tasks when targets appeared in near space, t(16) = 1.55, p > 0.05. In addition, the results of Experiment 2 indicated that the difference between deaf and hearing participants was not due to an impaired sense of balance in the deaf participants.
We conclude that the interaction between spatial domain and spatial reference frame is altered after early deafness, resulting in the different interaction patterns observed between deaf and hearing participants.