What we are working on
Tactile-visual interactions for saccade planning during free viewing and their modulation by TMS
funding: Research Collaborative 936 (second funding period), project B1, German Research Foundation
principal investigators: Tobias Heed, Peter König, Brigitte Röder
project member: We are hiring a PostDoc! Apply by June 15, 2015
This project investigates how different brain regions interact to merge spatial information across the sensory systems. The project continues our work from the first funding period of the Research Collaborative.
We found that touch modulates where you will look next, even when this touch is not immediately relevant to you. Imagine yourself walking through a forest. When you hear birds sing, you might look towards the tree tops, even though you hear many birds and are not trying to see one in particular. The birdsong simply makes you explore the tree tops in general. You may also feel that your feet are walking on soft ground, and that there are twigs and stones in your path. These tactile sensations might make you look ahead on your path.
We are curious about how the brain integrates these kinds of sensory information – what you hear and what you feel – to decide where to look. We combine behavioral research, EEG, and TMS to tackle these questions.
Sensorimotor processing and reference frame transformations in the human brain
funding: Emmy Noether Programme of the German Research Foundation
principal investigator: Tobias Heed
project members: Janina Brandes, Phyllis Mania
The main aim of this project is to connect two areas of research that have usually been investigated separately: perceiving touch and making reaching movements. When we perceive a touch, we first know where it was on the skin. But because our body parts move a lot, we have to consider our body posture if we want to act on the touch.
Imagine you are sitting in a park at a picnic. You feel something crawling on your left hand. To look at what is crawling, and to swipe it off, it makes a big difference whether your left hand is behind your back because you are leaning on it, or whether you are holding a plate with it in front of you. Your movements towards the touch must be very different in these two situations. At the same time, when we perform actions, like swiping away the insect on our hand, we feel our movement and monitor whether we are achieving our goal.
Thus, body posture is central both to touch perception and to reaching. In this project, we investigate how body posture affects touch and reaching, each on its own. Ultimately, we are interested in how the two are coordinated, that is, how the brain plans and executes movements towards one's own body after it has registered a touch.
The role of vision for shaping cortico-cortical interactions mediating sensorimotor transformations
When you feel a touch, then you not only know where it was on the skin, but you also know where the touch is in space. This is less trivial than it may sound, because our body parts are constantly in motion. Therefore, a touch to any given body part can be in many locations in space, depending on our body posture.
Whereas sighted humans automatically use the location in space, people who were born blind often do not. This difference has led to the idea that the way we process touch strongly depends on the visual system.
In this project, we challenge this conclusion and investigate an alternative idea. We test whether the way we process touch may also depend on our ability to move. Because sighted and blind people move in just the same way, the differences that researchers have observed between them in earlier studies should be less prominent when movements are executed.
The project is part of the Research Collaborative ‘Multi-site communication in the brain’, a large collaborative project interested in the role of communication between different brain areas. Our project therefore also investigates whether brain communication differs between sighted and blind people when they feel touch.