
Courses by our PhDs

This summer term, two of our PhD candidates are teaching Psychology classes at the University of Hamburg. Phyllis Mania is giving an in-depth course on Biological Psychology for first-year students. Topics range from sensorimotor processing to comparative and social neuroscience.

Janina Brandes is teaching research practice to second-year Bachelor students in the course “Empirisch Experimentelles Praktikum” (empirical-experimental lab course). Students are encouraged to design their own experiments, acquire and analyze data independently, and write a short report. The results will also be presented in a poster session at the end of the term. The date will be announced soon; everyone is invited to join!


We are hiring a PostDoc – apply by June 15, 2015

Funding

The position is funded by a project in the Research Collaborative (Sonderforschungsbereich, SFB) 936, “Multi-site communication in the brain”. The principal investigators of the project are Tobias Heed (Hamburg), Peter König (Osnabrück), and Brigitte Röder (Hamburg). The position is attached to the Emmy Noether Group “Reach and Touch” headed by Tobias Heed, within the Biopsychology department of the University of Hamburg.

The SFB is funded for 4 years. The earliest starting date is July 1, 2015, but a later start is possible. The position will end on June 30, 2019, regardless of the starting date.

The SFB comprises 18 projects, all of which investigate aspects of brain connectivity. Methods courses, talks by international guests, and yearly retreats provide an interesting, interdisciplinary research environment.


Project

The advertised position is in the project “Tactile-visual interactions for saccade planning during free viewing and their modulation by TMS”. The project investigates how saccade planning is influenced by tactile input. The project’s focus is on connectivity between unisensory and multisensory brain regions, measured with EEG, and on the effects of disturbing these networks with TMS.

The PostDoc’s tasks are the planning, data acquisition, analysis, and publication of behavioral, EEG, and combined EEG/TMS studies.

The SFB is located in Hamburg (commonly known as the most beautiful city in the world…). Some initial training for the PostDoc is planned to take place in Peter König’s lab in Osnabrück. The project involves close collaboration between the Hamburg and Osnabrück labs, as well as with other EEG/MEG projects of the SFB.


Who we are looking for

must-have:

  • You have a university degree in a relevant subject, plus a doctorate (PhD).
  • You have experience with the planning, data acquisition, analysis, and publication of EEG or MEG studies involving frequency analysis, preferably with the FieldTrip toolbox (see the brief illustrative sketch after this list).
  • You have experience programming experiments (e.g. in Matlab, Presentation, or Python).
  • You like to work and integrate with a team, but you can also work very independently. Applications from both junior and experienced PostDocs are welcome.
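
To illustrate the kind of frequency analysis mentioned above, here is a minimal, purely illustrative FieldTrip sketch of a time-frequency power analysis. The variable names and parameter values (the epoched data structure ‘data’, frequency range, window lengths, the plotted channel) are assumptions chosen for the example, not part of the project’s actual analysis pipeline.

    % Minimal, illustrative FieldTrip sketch: time-frequency analysis of power.
    % Assumes 'data' is an epoched FieldTrip raw-data structure (e.g. from ft_preprocessing).
    cfg           = [];
    cfg.method    = 'mtmconvol';      % multitaper convolution in the time-frequency domain
    cfg.taper     = 'hanning';        % single Hanning taper per window
    cfg.output    = 'pow';            % return power estimates
    cfg.foi       = 2:2:30;           % frequencies of interest (Hz), covering alpha and beta
    cfg.t_ftimwin = 5 ./ cfg.foi;     % sliding window of 5 cycles per frequency
    cfg.toi       = -0.5:0.05:1.0;    % time points of interest (s), relative to stimulus onset
    freq          = ft_freqanalysis(cfg, data);

    % Plot the baseline-corrected time-frequency representation for one channel.
    cfgplot              = [];
    cfgplot.channel      = 'Cz';
    cfgplot.baseline     = [-0.5 0];
    cfgplot.baselinetype = 'relchange';
    ft_singleplotTFR(cfgplot, freq);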

good-to-have:

  • You have experience with the analysis of EEG/MEG connectivity, eye tracking, and/or TMS.


What next?

If you have any questions, please do not hesitate to contact Tobias Heed (tobias.heed@uni-hamburg.de).

You can find the official advertisement here.

Applications should be sent by email, in one single pdf file, to tobias.heed@uni-hamburg.de by June 15, 2015.

New paper: Reference frames for tactile attention are reflected in alpha and beta EEG activity

New paper in press: Jonathan Schubert, Verena N. Buchholz, Julia Föcker, Andreas Engel, Brigitte Röder, & Tobias Heed: Oscillatory activity reflects differential use of spatial reference frames by sighted and blind individuals in tactile attention, to appear in NeuroImage.

Researchers have recognized rhythmic variations of the EEG signal, termed oscillatory activity, as an important indicator of cognitive activity. In this paper, we explored how oscillatory activity in two frequency bands, alpha and beta, relates to the spatial processing of touch.

Alpha and beta activity change when a person is paying attention to a defined area of space. For example, when you expect the traffic light on your left to turn green soon, your right hemisphere, which is responsible for the left visual field, will show reduced alpha and beta activity. But what about touch? Imagine you hear a mosquito flying around in your dark room; it seems to be to your left. You concentrate on your left arm, afraid that the mosquito will touch down and suck your blood. Do alpha and beta activity respond in the same way as when you were waiting for the traffic light to change?

When we cross our right hand over to the left side, our brain codes this hand in two ways: as a right body part, but also as a body part in left space. We tested whether alpha and beta activity change according to the “body” code or according to the “space” code. We found that the two frequency bands behave differently. Alpha activity changes according to the “space” code, and beta activity changes according to the “body” code — at least when your visual system has developed normally.

We also analyzed how the frequency bands behave in people who were born blind. In this group, alpha and beta activity both changed according to the “body” code. This difference in neural activity between sighted and blind humans fits well with behavioral differences between the two groups that we have investigated in earlier studies. Sighted humans automatically perceive touch in space. People who were born blind often rely entirely on a “body” code. Still, both groups appeared to use the same brain regions to direct their attention to one or the other hand. This means that these brain regions use a different spatial code for attention to touch depending on whether you have sight or not — an impressive demonstration of how the different sensory systems influence one another.

New paper: Irrelevant tactile stimulation biases visual exploration in external coordinates

New paper in press: José Ossandón, Peter König, and Tobias Heed: Irrelevant tactile stimulation biases visual exploration in external coordinates, to appear in Scientific Reports.

Humans make rapid eye movements, so-called saccades, about 2-3 times a second. Peter König and his group have studied how we choose the next place to look at. It turns out that a number of criteria come together for this decision: “low-level” visual features like contrast and brightness, “high-level” visual features like interestingness, and general preferences for one or the other side of space all influence where the eyes want to go next.

But imagine yourself walking through a forest. When you hear birds singing, or some cracking in the undergrowth, you will direct your gaze in a general direction (up into the trees or towards the nearby bushes). Similarly, you might feel that the ground is soft and uneven, making you scan the path in front of you. All of these are examples of information gathered by senses other than vision.

In this new paper, we brought together Peter König’s interest in eye movement choices and Tobias’s interest in the processing of touch in space. Together with Peter’s PhD student José Ossandón, we investigated how touch influences where we look next. Our participants viewed natural images and from time to time received a tap on one of their hands. We told participants that the taps were entirely irrelevant (and indeed, they never had to do anything with them). Nevertheless, when we tapped the left hand, the next few eye movements landed on the left side of the viewed scene more often than when we tapped the right hand. Our participants did not look to where we had applied the tap on the hand; rather, the taps made them orient in the general direction of the touch, towards the left or right side of space.

We then asked our participants to cross their hands: now their left hand was on the right, and the right hand was on the left. In this situation, tapping the left hand made participants look more often to the right – that is, towards the side of space in which the tapped hand lay. In other words, eye movements were biased towards where the tap was in space, not towards where it was on the body. This finding is a nice example of how our brain recodes information from the different senses, here touch (see our recent review paper on tactile remapping for more information), and uses it to guide behavior, for example exploratory eye movements.

The collaboration between José, Peter, and Tobias emerges from the Research Collaborative SFB 936, to which both Peter and Tobias have contributed research projects.

New paper: Effects of movement on tactile localization in sighted and blind humans

New paper in press: Tobias Heed, Johanna Möller, and Brigitte Röder: Movement induces the use of external spatial coordinates for tactile localization in congenitally blind humans, to appear in Multisensory Research.

We and others have often found that people who were born blind process touch differently than people who can see. Sighted people automatically compute where a touch is in space; that is, they combine the location of the touch on the skin with where the touched body part currently is. Congenitally blind people don’t seem to do the same. Instead, they mostly rely on the location of the touch on the skin, unless they really have to derive the location in space. Given these differences, the visual system is apparently important for how we perceive touch.

In this new study, we asked whether this changes when people move. We did indeed find that blind humans code touch differently while they move than while they are still, and, as we had suspected, they seem to derive a location for touch in space in this situation. Yet, this spatial location appears to be of much higher relevance to sighted than to blind people.

Our results therefore confirm that whether or not you can see critically influences the way you perceive touch. However, they also show that how we code touch is affected by movement.

New paper: Tactile remapping: from coordinate transformation to integration in sensorimotor processing

New paper in press: Heed, T., Buchholz, V.N., Engel, A.K., and Röder, B.: Tactile remapping: from coordinate transformation to integration in sensorimotor processing, to appear in Trends in Cognitive Sciences.

Tactile remapping, the transformation of where a touch is on the skin into a location in space, has often been conceptualized as a serial process: first, we perceive where the touch is on the skin; then, we compute the location in space; and from then on, we use that spatial location. In this opinion paper, we argue that tactile localization is better viewed not as a serial, but as an integrative process. We propose that the brain determines where a touch was by using all kinds of information, including the skin location and the newly computed location in space. This view has emerged over recent years in our lab, most clearly in Steph Badde’s work on tactile remapping.

This view raises the question of how the different pieces of information are brought together and integrated. In our paper, we suggest that the analysis of oscillatory brain activity and large-scale brain connectivity may be ideally suited to investigate these kinds of questions.

My favorite part of the paper is a sideline we explore in Box 1. We briefly introduce the idea of sensorimotor contingencies as the basis for the transformation between different spatial formats (like skin location and space). According to this view, the brain might learn the relationship between the different formats by learning the statistical distributions of the sensory and motor signals that occur together. To make this a bit more graspable, imagine you feel, for the first time in your life, an itch on your nose (skin location). To direct your arm to scratch your nose, you could make random movements until you finally reach the nose and relieve the itch. Over time, you would realize that relief of the nose itch happens when your arm is in a certain location, an event that you can associate with seeing your hand near your face and with the proprioceptive signals that go along with this arm posture. Traditionally, researchers have assumed that the brain has to calculate the location of the nose in space, and that this spatial location can then be used to guide the hand. In the sensorimotor contingency approach, no such explicit derivation of the nose’s spatial position is necessary: you simply re-create all the sensory signals that you have learned will co-occur when a nose itch ends, by initiating the appropriate motor commands that lead to this end state. Given that I have investigated transformations between different spatial formats for several years, the prospect that they might not exist at all was a bit daunting at first. However, on second thought, I realized that the sensorimotor contingency approach fits perfectly with the integration idea we promote in our opinion paper.

The paper emerges from the cooperation of several projects within the Research Collaborative “Multi-site communication in the brain”.

MSc Thesis prize

Karima Chakroun was awarded 3rd place for the best MSc thesis of the Psychology department in 2014. Karima conducted her thesis in our lab under Tobias’ supervision. Her topic was ‘„Bauchgefühl“ für den Raum – Taktil-visuelle Kongruenzeffekte bei der räumlichen Wahrnehmung des eigenen Körpers’ (a “gut feeling” for space – tactile-visual congruency effects in the spatial perception of one’s own body). Congratulations!