Multisensory homunculi align

Posted by Bayle Shanks at 3:26 AM EST

Read on for a talk abstract describing aligned visual and tactile homunculi in parietal cortex.

MAPPING MULTISENSORY REPRESENTATIONS OF PERIPERSONAL SPACE

Ruey-Song Huang

Swartz Center for Computational Neuroscience, Institute for Neural Computation, and Department of Cognitive Science, University of California, San Diego

http://sccn.ucsd.edu/~rshuang/

This talk will present our recent progress in mapping multisensory representations of peripersonal space using fMRI, covering both technical developments and scientific findings. We have recently developed wearable techniques for high-density and/or wide-range tactile stimulation in the MRI scanner. Sixty-four channels (expandable to 128) of computer-controlled air puffs can be delivered via plastic tubes and nozzles embedded in an air suit, including a face mask, turtleneck, gloves, and pants. These wearable techniques make it possible to present more complex tactile stimuli with programmable spatiotemporal patterns on the body surface, e.g., a 2-D tactile display or tactile apparent motion.
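(A rough illustration, not from the talk: such a programmable spatiotemporal pattern amounts to a schedule of valve openings, and sweeping pulse onsets across adjacent channels is one way to produce tactile apparent motion. Channel indices, pulse duration, and onset asynchrony below are assumed values.)

    # Illustrative sketch only: scheduling air-puff pulses across a row of
    # channels to produce tactile apparent motion. Channel count, pulse
    # duration, and stimulus-onset asynchrony are assumed, not the speaker's.
    def apparent_motion_schedule(channels, pulse_ms=100, soa_ms=60):
        """Return (onset_ms, channel, duration_ms) triples sweeping a puff across the skin."""
        return [(i * soa_ms, ch, pulse_ms) for i, ch in enumerate(channels)]

    # Example: sweep across 8 of the 64 channels, e.g. along the forearm.
    for onset, ch, dur in apparent_motion_schedule(range(8)):
        print(f"t={onset:4d} ms: open valve {ch} for {dur} ms")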

Multiple two-condition block-design scans revealed a high-level somatotopic homunculus consisting of the parietal face, lip, finger, and shoulder areas in the superior parietal lobule. Retinotopic mapping using a phase-encoded design and wide-field visual stimuli (masked videos or looming objects) further revealed an aligned visual homunculus in the same areas. A region of lower-visual-field representation in the postcentral sulcus partially overlaps the parietal finger area, which lies anterior and lateral to the parietal face/lip areas. Another region of lower-visual-field representation, superior and medial to the parietal face area, partially overlaps the parietal shoulder area. Regions of upper-visual-field representation, however, were restricted to the parietal face area. We suggest that aligned multisensory homunculi may play important roles in combining visual and tactile information to facilitate movements in peripersonal space (e.g., eating involves hand-to-mouth coordination in the lower visual field).
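(For the curious, phase-encoded mapping in its generic form assigns each voxel the phase of its response at the frequency at which the stimulus cycles through locations; that phase then indexes the voxel's preferred location. A minimal sketch of the idea, mine rather than the speaker's pipeline, with assumed scan parameters:)

    # Minimal sketch of generic phase-encoded analysis: the phase of a voxel's
    # response at the stimulation frequency indexes its preferred location.
    # The cycle count and time-series length below are assumed values.
    import numpy as np

    def phase_encoded_estimate(ts, n_cycles=8):
        """Return (amplitude, phase) of the response at the stimulation frequency."""
        spectrum = np.fft.rfft(ts - ts.mean())
        component = spectrum[n_cycles]   # frequency bin: n_cycles per scan
        return np.abs(component), np.angle(component)

    # Synthetic voxel: 256 time points, 8 stimulus cycles, plus noise.
    t = np.arange(256)
    ts = np.sin(2 * np.pi * 8 * t / 256 + 1.0) + 0.5 * np.random.randn(256)
    amp, phase = phase_encoded_estimate(ts)
    print(f"amplitude={amp:.1f}, phase={phase:.2f} rad")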

One Response to “Multisensory homunculi align”

  1. Bill Says:

    Funny how philosophers of consciousness try to exclude homunculi for logical reasons even as we find more and more of them empirically…
