New paper: Reference frames for tactile attention are reflected in alpha and beta EEG activity

New paper in press: Jonathan Schubert, Verena N. Buchholz, Julia Föcker, Andreas Engel, Brigitte Röder, & Tobias Heed, “Oscillatory activity reflects differential use of spatial reference frames by sighted and blind individuals in tactile attention”, to appear in NeuroImage.

Researchers have recognized rhythmic variations of the EEG signal, termed oscillatory activity, as an important indicator of cognitive activity. In this paper, we explored how oscillatory activity in two frequency bands, alpha and beta, relates to the spatial processing of touch.

Alpha and beta activity change when a person pays attention to a particular region of space. For example, when you expect the traffic light on your left to turn green soon, your right hemisphere, which is responsible for the left visual field, shows reduced alpha and beta activity. But what about touch? Imagine you hear a mosquito flying around in your dark room; it seems to be to your left. You concentrate on your left arm, afraid that the mosquito will land and suck your blood. Do alpha and beta activity respond in the same way as they did while you were waiting for the traffic light?
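For readers who want to see what “reduced alpha and beta activity” over one hemisphere means in practice, here is a minimal, purely illustrative Python sketch of how band-limited EEG power can be compared between two channels. It is not the analysis pipeline of the paper; the sampling rate, the 8–12 Hz alpha band limits, the toy random data, and the simple lateralization index are all assumptions made only for demonstration.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert


def band_power(signal, fs, low, high):
    """Band-pass filter a 1-D EEG trace and return its mean power
    (squared Hilbert envelope) in the given frequency band."""
    b, a = butter(4, [low, high], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, signal)
    envelope = np.abs(hilbert(filtered))
    return np.mean(envelope ** 2)


# Toy data: two "hemisphere" channels of random noise standing in for EEG.
fs = 500  # assumed sampling rate in Hz
rng = np.random.default_rng(0)
left_hemisphere = rng.standard_normal(10 * fs)
right_hemisphere = rng.standard_normal(10 * fs)

# Alpha band (roughly 8-12 Hz): attention to the left side would be expected
# to lower alpha power over the right hemisphere relative to the left one.
alpha_left = band_power(left_hemisphere, fs, 8, 12)
alpha_right = band_power(right_hemisphere, fs, 8, 12)

# A simple normalized lateralization index (positive = less power on the right).
lateralization = (alpha_left - alpha_right) / (alpha_left + alpha_right)
print(f"alpha lateralization index: {lateralization:.3f}")
```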

When we cross our right hand over to the left side, our brain codes this hand in two ways: as a right body part, but also as a body part located in left space. We tested whether alpha and beta activity change according to the “body” code or according to the “space” code. We found that the two frequency bands behave differently: alpha activity follows the “space” code, whereas beta activity follows the “body” code, at least when the visual system has developed normally.
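To make the two codes concrete, the following tiny Python sketch spells out the mapping between them for a crossed-hands posture. The function name and the “left”/“right” labels are made up purely for illustration and are not taken from the paper.

```python
def external_side(anatomical_hand: str, hands_crossed: bool) -> str:
    """Return the side of external space that a touched hand occupies.

    anatomical_hand is the "body" code ("left" or "right"); the return
    value is the "space" code, which flips when the hands are crossed.
    """
    if not hands_crossed:
        return anatomical_hand
    return "left" if anatomical_hand == "right" else "right"


# A touch on the right hand ("body" code: right) that currently rests on
# the left side of space is coded as left in the "space" code.
assert external_side("right", hands_crossed=True) == "left"
```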

We also analyzed how the two frequency bands behave in people who were born blind. In this group, alpha and beta activity both changed according to the “body” code. This difference in neural activity between sighted and blind individuals fits well with behavioral differences between the two groups that we have investigated in earlier studies: sighted humans automatically perceive touch in space, whereas people who were born blind often rely entirely on a “body” code. Still, both groups appeared to use the same brain regions to direct their attention to one hand or the other. This means that these brain regions use a different spatial code for attention to touch depending on whether or not you can see, an impressive demonstration of how the different sensory systems influence one another.