Why do we visually recognise an object we’ve only touched?

A team from the University of Geneva identifies a key brain region that abstracts spatial information from the senses, illuminating a fundamental pillar of intelligence.

A mouse stands between red and blue lights representing the lower and upper parts of its surrounding space, where touch and vision converge. © Sami El-Boustani

The brain has a remarkable capacity for abstraction. For example, it allows us to recognise an object in complete darkness through touch alone, even if we have previously only identified it by sight. This ability to transfer learning, or sensory representations, from one modality to another is considered a cornerstone of intelligence. It is found in many animals and even in some insects. However, the brain mechanisms behind it remain poorly understood.


These results open up promising avenues in medicine and artificial intelligence.

Recent work on mice by a team from the University of Geneva has led to new advances. The researchers were able to pinpoint the areas of the cortex where tactile and visual information are combined. These regions are thought to play a central role in sensory generalisation. In particular, the rostro-lateral area (RL), located in the dorsal region of the cortex, appears to be essential for this cognitive ability.

Mice show generalisation too

To achieve this result, the scientists first trained mice to distinguish between tactile stimulation from above or below, perceived via their whiskers, or "vibrissae". If the lower vibrissae were stimulated, the mice had to lick a tube that delivered a reward. If the upper vibrissae were stimulated, nothing happened. "After a week, they had integrated the rule very well," says Sami El-Boustani, assistant professor in the Department of Basic Neurosciences at the University of Geneva Faculty of Medicine, who led the study.

To test the rodents' ability to generalise, the researchers then replaced the tactile stimuli with visual ones: a shadow moving across the field of vision from either above or below. "We then found that the mice adapted very well to this change of sensory modality and always responded to the stimulus coming from below. The new task was always performed correctly," explains Maëlle Guyoton, a postdoctoral researcher in the Department of Basic Neurosciences at the University of Geneva Faculty of Medicine and co-first author of the study.

Promising applications

By mapping the cerebral activity of these mice at single-cell resolution, the team discovered the specific areas combining touch and vision, including the RL area. By inactivating it, they observed that the mice lost their ability to generalise, while remaining capable of learning and performing tasks using only one sense. Conversely, by optically stimulating the RL area, the scientists were able to induce generalisation.

"The RL area is therefore a key region of the brain: it allows the mouse to understand that what it felt with its whiskers in the dark corresponds to what it now sees in full light," explains Giulio Matteucci, a postdoctoral researcher in the Department of Basic Neurosciences at the University of Geneva Faculty of Medicine and co-first author of the study.

These findings open up promising avenues not only in medicine, where a better understanding of these circuits could inform research on sensory disorders, but also in artificial intelligence, where systems have to learn to integrate data from a variety of sources, whether text, image or sound.

This research is published in Nature Communications.

DOI: 10.1038/s41467-025-59342-9