New paper on the anchors of external reference frames in touch (PLoS ONE)

New paper:
Heed T., Backhaus J., Röder B., Badde S. (2016).
Disentangling the External Reference Frames Relevant to Tactile Localization.
PLoS ONE 11(7):e0158829. doi:10.1371/journal.pone.0158829
[ open access pdf ] [ data & scripts ]

In this paper, we publish work begun by Jenny Backhaus during her time in our lab several years ago. It took us a while to get the paper ready, because we analyzed the data with Generalized Linear Mixed Models (GLMM), a statistical approach that proved difficult to apply to data from the experimental paradigm we used here, temporal order judgments.
We have started publishing Open Access and providing the data and analysis scripts for our papers. So if you would like to try out statistics other than the ones we used here, run wild.
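As a concrete starting point for such re-analysis, here is a minimal sketch of a binomial GLMM fit to simulated TOJ-style data in Python. This is not our pipeline (the scripts linked above are the authoritative reference): the column names (`subject`, `soa`, `posture`, `resp_right_first`) are invented for illustration, and statsmodels fits its mixed GLMM by variational Bayes rather than by maximum likelihood.

```python
# Hypothetical sketch of a binomial GLMM on temporal order judgment (TOJ) data.
# Column names and the data simulation are illustrative, not the paper's.
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(0)

# Simulate 10 subjects x 2 postures x 8 SOAs; response = "right stimulus first".
# (A real experiment would include many repetitions per cell.)
soas = np.array([-200, -90, -55, -30, 30, 55, 90, 200])  # ms, right minus left
rows = []
for subj in range(10):
    bias = rng.normal(0, 20)          # subject-specific shift of the PSS
    for posture in (0, 1):            # e.g., 0 = hands uncrossed, 1 = crossed
        for soa in soas:
            p = 1.0 / (1.0 + np.exp(-(soa - bias - 15 * posture) / 40.0))
            rows.append({"subject": subj, "posture": posture, "soa": soa,
                         "resp_right_first": rng.binomial(1, p)})
data = pd.DataFrame(rows)

# Fixed effects: SOA x posture; random intercept per subject as a
# variance component.
model = BinomialBayesMixedGLM.from_formula(
    "resp_right_first ~ soa * posture", {"subject": "0 + C(subject)"}, data)
result = model.fit_vb()   # variational Bayes fit
print(result.summary())
```

The fixed effects describe how the probability of a "right first" response shifts with SOA and posture, while the variance component absorbs stable between-subject differences.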

The research question

One of the central themes of the Reach & Touch Lab is that the brain automatically places tactile events in space. Given that touch is perceived through sensors in the skin, projecting touch into space requires computations that integrate the skin location of the touch with the current posture of one’s body.
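As a toy illustration of what such a computation involves (our own sketch, not a model from the paper): the same touch on the forearm skin ends up at different external locations depending on how the elbow is rotated, so the skin coordinate must be combined with the joint configuration, much like forward kinematics.

```python
# Toy illustration (not from the paper): locating a touch in external space
# requires combining its skin location with the body's current posture.
import numpy as np

def touch_in_external_space(dist_along_forearm, elbow_angle_deg,
                            elbow_xy=(0.0, 0.0)):
    """2D forward kinematics: a touch at a fixed skin location on the
    forearm lands at different external positions as the elbow rotates."""
    theta = np.deg2rad(elbow_angle_deg)
    return (elbow_xy[0] + dist_along_forearm * np.cos(theta),
            elbow_xy[1] + dist_along_forearm * np.sin(theta))

# Same skin location (20 cm from the elbow), two postures, two external spots.
print(touch_in_external_space(0.20, 0))    # forearm extended to the right
print(touch_in_external_space(0.20, 90))   # forearm pointing upward
```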
But space is a relative concept: the brain could code touch relative to many anchors. For instance, it could code every touch relative to the eyes. This would be useful because touch locations could then easily be integrated with locations coded by the visual system. But there are many alternative “anchors” relative to which the brain could code touch; suggestions have included the head, the torso, and even landmarks outside the body.

What we show

We manipulated body posture in a way that allowed us to disentangle different possible anchors that may be relevant in touch. We present three main findings. First, the eyes appear to be an anchor for tactile space. Second, the head and torso did not play a role in tactile coding in our experiments. Finally, however, an eye anchor alone cannot explain our participants’ behavior; this result suggests that other spatial codes (just not head- and torso-centered ones) play a role in tactile processing, in line with previous results we have published (see our recent review). We suspect that an important code is an object-centered one, in which spatial coordinates would depend on the body parts involved, as well as on their posture relative to one another.

Why the results are important

The reference frames used in touch – that is, the anchors relative to which space is coded – have been investigated with a number of different paradigms. Our present paper connects a popular paradigm, the so-called temporal order judgment (TOJ), with a large body of literature that has used other paradigms, by showing that the eyes are consistently important for tactile spatial coding. The temporal order judgment is a very flexible paradigm, which makes it an attractive choice for tactile research. It is therefore important to know that it yields results that generalize to other paradigms.
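For readers unfamiliar with the paradigm: in a tactile TOJ, two touches are delivered in quick succession, one to each hand, and participants report which came first. A standard way to summarize such data (a generic sketch with made-up numbers, not this paper's GLMM analysis) is to fit a cumulative Gaussian to the proportion of “right first” responses across stimulus onset asynchronies (SOAs), yielding a point of subjective simultaneity (PSS) and a just-noticeable difference (JND):

```python
# Sketch of a common TOJ summary analysis (not the paper's exact pipeline):
# fit a cumulative Gaussian to "right first" responses as a function of SOA.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Made-up example data: SOA in ms (right minus left) and the observed
# proportion of "right first" responses at each SOA.
soa = np.array([-200, -90, -55, -30, 30, 55, 90, 200])
p_right_first = np.array([0.05, 0.15, 0.30, 0.45, 0.60, 0.75, 0.90, 0.97])

def cum_gauss(x, pss, sd):
    """Cumulative Gaussian psychometric function."""
    return norm.cdf(x, loc=pss, scale=sd)

(pss, sd), _ = curve_fit(cum_gauss, soa, p_right_first, p0=(0.0, 50.0))
jnd = sd * norm.ppf(0.75)   # one common JND convention: the 75% threshold
print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")
```

The PSS marks the SOA at which both orders are reported equally often, and the JND indexes temporal resolution; posture manipulations like hand crossing typically shift or flatten this curve.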
Furthermore, our finding that an object-centered reference frame may be particularly important in tactile coding challenges us to develop experiments that will directly test this hypothesis.
