New paper: Tactile remapping: from coordinate transformation to integration in sensorimotor processing

New paper by Heed, T., Buchholz, V.N., Engel, A.K., & Röder, B. (in press). Tactile remapping: from coordinate transformation to integration in sensorimotor processing. Trends in Cognitive Sciences.

Tactile remapping, the transformation of a touch's location on the skin into a location in external space, has often been conceptualized as a serial process: first, we perceive where the touch is on the skin; then, we compute its location in space; from then on, we use only that spatial location. In this opinion paper, we argue that tactile localization is better viewed not as a serial but as an integrative process. We propose that the brain determines where a touch occurred by integrating all available information, including both the skin location and the newly computed location in space. This view has emerged over recent years in our lab, most clearly in Steph Badde's work on tactile remapping.
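To give a flavor of what such integration could look like computationally, here is a minimal sketch using reliability-weighted averaging of a skin-based and an external-space location estimate. The weighting scheme and all names and numbers are illustrative assumptions on my part, not the model from the paper.

```python
import numpy as np

# Minimal sketch: reliability-weighted integration of two location estimates.
# All names and numbers are illustrative, not the paper's actual model.

def integrate(loc_skin, var_skin, loc_space, var_space):
    """Combine a skin-based and an external-space location estimate.

    Each estimate is weighted by its reliability (inverse variance),
    the standard cue-combination rule.
    """
    w_skin = 1.0 / var_skin
    w_space = 1.0 / var_space
    loc = (w_skin * loc_skin + w_space * loc_space) / (w_skin + w_space)
    var = 1.0 / (w_skin + w_space)  # the combined estimate is more reliable
    return loc, var

# Example: the skin-based estimate suggests ~30 cm along some external axis,
# while posture-based remapping suggests ~34 cm, with different reliabilities.
loc, var = integrate(loc_skin=30.0, var_skin=4.0, loc_space=34.0, var_space=2.0)
print(f"integrated location: {loc:.1f} cm (variance {var:.2f})")
```

The point of the sketch is that neither estimate is discarded after remapping: both keep contributing, weighted by how reliable they are.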

This view raises the question of how the different pieces of information are brought together and integrated. In our paper, we suggest that the analysis of oscillatory brain activity and large-scale brain connectivity may be ideally suited to investigating these kinds of questions.
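As a rough illustration of the kind of analysis meant here, the sketch below computes spectral coherence between two simulated signals that share a 10 Hz rhythm, one standard way to quantify oscillatory coupling between recording sites. The simulated signals and parameters are invented for illustration and are not from the paper.

```python
import numpy as np
from scipy.signal import coherence

# Illustrative only: coherence between two simulated signals that share
# a 10 Hz oscillation, a common measure of oscillatory coupling.
fs = 250                                   # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)               # 10 seconds of "recording"
shared = np.sin(2 * np.pi * 10 * t)        # shared 10 Hz rhythm
x = shared + 0.5 * np.random.randn(t.size) # site 1: rhythm + noise
y = shared + 0.5 * np.random.randn(t.size) # site 2: rhythm + noise

f, cxy = coherence(x, y, fs=fs, nperseg=512)
peak = f[np.argmax(cxy)]
print(f"coherence peaks near {peak:.1f} Hz")  # ~10 Hz, the shared rhythm
```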

My favorite part of the paper is a sideline we explore in Box 1. We briefly introduce the idea of sensorimotor contingencies as the basis for the transformation between different spatial formats (like skin location and external space). On this view, the brain might learn the relationship between the different formats by learning the statistical distributions of the sensory and motor signals that occur together. To make this more concrete, imagine you feel, for the first time in your life, an itch on your nose (a skin location). To direct your arm to scratch the nose, you could make random movements until you finally reach the nose and relieve the itch. Over time, you would learn that relief of the nose itch happens when your arm is in a certain location, an event you can associate with seeing your hand near your face and with the proprioceptive signals that accompany this arm posture. Traditionally, researchers have assumed that the brain has to calculate the location of the nose in space, and that this spatial location can then be used to guide the hand. In the sensorimotor contingency approach, no such explicit derivation of the nose's spatial position is necessary: you simply re-create all the sensory signals that you have learned co-occur when a nose itch ends, by initiating the appropriate motor commands that lead to this end state.

Given that I have investigated transformations between different spatial formats for several years, the prospect that they might not exist at all was a bit daunting at first. On second thought, however, I realized that the sensorimotor contingency approach fits perfectly with the integration idea we promote in our opinion paper.
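For fun, here is a toy version of the nose-itch story under the sensorimotor contingency reading. The one-dimensional "arm" and all signal names are invented for illustration; the point is only that the agent ends up scratching the nose without ever computing the nose's position in space.

```python
import random

# Toy sketch of the nose-itch example: the agent never computes the nose's
# position in space. It learns, by trial and error, which motor command
# co-occurs with the sensory state "itch relieved", and later simply
# re-creates that end state. Everything here is an invented illustration.

NOSE = 7  # hidden from the agent; it only sees sensory consequences

def sense(arm_pos):
    """Sensory signals that co-occur with a given arm posture."""
    return {
        "proprioception": arm_pos,                # felt arm posture
        "vision_hand_near_face": arm_pos == NOSE, # seeing hand near face
        "itch_relieved": arm_pos == NOSE,         # the itch ends
    }

# Learning phase: random movements, storing motor command -> sensory outcome,
# until one of those outcomes includes relief of the itch.
contingencies = {}
while not any(o["itch_relieved"] for o in contingencies.values()):
    command = random.randint(0, 10)   # a random motor command
    contingencies[command] = sense(command)

# Later: to scratch the nose, pick the command whose remembered outcome
# includes "itch_relieved" and re-create that learned end state.
scratch = next(c for c, o in contingencies.items() if o["itch_relieved"])
print(f"learned scratching command: {scratch}")
```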

The paper grew out of a collaboration among several projects within the Research Collaborative “Multi-site communication in the brain”.