When we touch ourselves, the pressure feels weaker than when someone else touches us, an effect known as sensory attenuation. Sensory attenuation is spatially tuned and occurs only if the positions of the touching and the touched body-part spatially coincide. Here, we ask how visual and proprioceptive signals contribute to determining self-touch. Using a 3D arm model in a virtual reality environment, we dissociated the visual from the proprioceptive arm signal. When a virtual arm was visible, indicating self-touch, sensory attenuation generalized across different locations. When no virtual arm was visible, sensory attenuation was strongest when subjects pointed to the position where they felt their arm to be located. We conclude that the spatial tuning of tactile attenuation depends on which signal determines the occurrence of self-touch. When observers can see their hand, the visual signal dominates the proprioceptive one, determining self-touch in a single visual snapshot. When only the proprioceptive signal is available, the positions of the touching and the touched body-part must be estimated separately and then compared to determine whether they overlap in anatomical space.
Keywords: Proprioception; Self-touch; Sensory attenuation; Spatial tuning; Vision.
© 2024. The Author(s).