Integrating multiple sensory modalities during dyadic interactions drives self-other differentiation at the behavioral and electrocortical level
Interpersonal motor interactions represent a key setting for processing signals from multiple sensory channels simultaneously, possibly modulating cross-modal multisensory integration, the crucial perceptual mechanism by which different sources of sensory information are combined into a single percept. Here we explored whether integrating sensorimotor signals while interacting with a partner can lead to shared sensorimotor representations and, possibly, to a recalibration of individuals' multisensory perception. Specifically, we investigated whether engaging individuals in dyadic activities relying on either a single sensory modality (e.g., visual or tactile/proprioceptive) or combined modalities (e.g., visuo-tactile/proprioceptive) affects the behavioral and electrocortical markers of interpersonal cross-modal integration. We show that interactions requiring the integration of multiple sensory modalities lead to greater interpersonal differentiation, resulting in reduced interpersonal cross-modal integration in a subsequent spatial detection task, and alter its distributed neural representations. In particular, the neural patterns elicited by interpersonal visuo-tactile stimuli vary with the sensory nature of the preceding interpersonal interaction: interactions involving multiple sensory modalities yield improved performance of a neural classifier. These findings suggest new avenues for sensorimotor approaches in social neuroscience, emphasizing the malleability of self-other representations as a function of the nature of interpersonal interactions.
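The abstract refers to the performance of a neural classifier on electrocortical patterns. As a rough illustration only, the sketch below shows a generic cross-validated multivariate decoding analysis of the kind commonly applied to EEG data; the data shapes, condition labels, and preprocessing are hypothetical placeholders and are not taken from the authors' pipeline.

# Minimal sketch of a cross-validated neural-pattern classifier.
# All data here are simulated placeholders (hypothetical trials x features
# EEG matrix, hypothetical condition labels); this is not the study's code.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)

# Hypothetical single-trial features: 200 trials, 64 channels x 50 time points
n_trials, n_features = 200, 64 * 50
X = rng.standard_normal((n_trials, n_features))
# Hypothetical labels: 0 = unimodal interaction context, 1 = multimodal context
y = rng.integers(0, 2, size=n_trials)

# Standardize features, then fit a linear classifier within each CV fold
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")
print(f"Mean decoding accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")

With simulated noise, accuracy should hover around chance (0.50); above-chance decoding on real data would indicate that the neural patterns carry information about the preceding interaction's sensory nature.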