New paper: Irrelevant tactile stimulation biases visual exploration in external coordinates

New paper in press: José Ossandón, Peter König, and Tobias Heed, “Irrelevant tactile stimulation biases visual exploration in external coordinates”, to appear in Scientific Reports.

Humans make rapid eye movements, so-called saccades, about 2–3 times per second. Peter König and his group have studied how we choose the next place to look at. It turns out that several criteria come together in this decision: “low-level” visual features like contrast and brightness, “high-level” visual features like how interesting a region is, and general preferences for one or the other side of space all influence where the eyes want to go next.

But imagine yourself walking through a forest. When you hear birds singing, or some cracking in the undergrowth, you will direct your gaze in a general direction, up into the trees or towards the nearby bushes. Similarly, you might feel that the ground is soft and uneven, making you scan the path in front of you. All of these are examples of information gathered by senses other than vision.

In this new paper, we brought together Peter König’s interest in eye movement choices and Tobias’s interest in the processing of touch in space. Together with Peter’s PhD student José Ossandón, we investigated how touch influences where we look next. Our participants viewed natural images and, from time to time, received a tap on one of their hands. We told participants that the taps were entirely irrelevant (and indeed, they never had to do anything with them). Nevertheless, when we tapped the left hand, the next few eye movements landed on the left side of the viewed scene more often than when we tapped the right hand. Our participants did not look at the spot on the hand where we had applied the tap; rather, the taps made them orient in the general direction of the touch, towards the left or right side of space.
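To make the logic of this analysis concrete, here is a minimal sketch in Python. It is our own illustration, not the authors’ analysis code; the trial structure, variable names, and numbers are hypothetical. The idea: for each tap, take the first few saccade endpoints that follow it, and compare how often they land on the left half of the screen after left-hand versus right-hand taps.

```python
# Minimal sketch (hypothetical data, not the paper's analysis pipeline):
# estimate the horizontal bias of the first few saccades after each tap.
# Each trial records the tapped hand and the horizontal endpoints of the
# following saccades (in degrees; 0 = screen centre, negative = left).

N_SACCADES = 3  # consider only the first few saccades after each tap

trials = [
    {"tapped_hand": "left",  "saccade_x": [-4.2, -1.0, 3.5]},
    {"tapped_hand": "right", "saccade_x": [2.1, 5.0, -0.5]},
    # ... one entry per tap
]

def left_side_proportion(trials, hand):
    """Proportion of post-tap saccade endpoints landing left of centre."""
    endpoints = [
        x
        for t in trials if t["tapped_hand"] == hand
        for x in t["saccade_x"][:N_SACCADES]
    ]
    return sum(x < 0 for x in endpoints) / len(endpoints)

bias = left_side_proportion(trials, "left") - left_side_proportion(trials, "right")
print(f"leftward bias after left vs. right taps: {bias:+.2f}")
```

A positive bias value would mean that gaze went to the left side of the scene more often after left-hand taps than after right-hand taps, which is the pattern we observed.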

We then asked our participants to cross their hands: now their left hand was on the right, and the right hand was on the left. In this situation, tapping the left hand made participants look more often to the right – that is, towards the side of space in which the tapped hand now lay. In other words, eye movements were biased towards where the tap was in space, not towards where it was on the body. This finding is a nice example of how our brain recodes information from the different senses, here touch (see our recent review paper on tactile remapping for more information), and uses it to guide behavior, such as exploratory eye movements.
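The coordinate logic itself can be stated in a few lines. The sketch below (a hypothetical function of our own, purely for illustration) captures the rule our results support: the gaze bias follows the tap’s side in external space, which coincides with the anatomical side when the hands are uncrossed and flips when they are crossed.

```python
# Illustration of the external-coordinate rule (our labels, not code from
# the paper): with crossed hands, the external side of a tap is opposite
# to its anatomical side.
def external_side(tapped_hand: str, hands_crossed: bool) -> str:
    """Side of external space occupied by the tapped hand."""
    if not hands_crossed:
        return tapped_hand  # "left" or "right"
    return "right" if tapped_hand == "left" else "left"

# Uncrossed: a left-hand tap lies on the left, and gaze is biased leftward.
assert external_side("left", hands_crossed=False) == "left"
# Crossed: the same left-hand tap now lies on the right, and gaze is biased rightward.
assert external_side("left", hands_crossed=True) == "right"
```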

The collaboration between José, Peter, and Tobias emerged from the Collaborative Research Centre SFB 936, to which both Peter and Tobias have contributed research projects.