by Georg F. Meyer, Sophie M. Wuerger, Florian Röhrbein, Christoph Zetzsche
Abstract:
It is well known that the detection thresholds for stationary auditory and visual signals are lower if the signals are presented bimodally rather than unimodally, provided the signals coincide in time and space. Recent work on auditory–visual motion detection suggests that the facilitation seen for stationary signals is not seen for motion signals. We investigate the conditions under which motion perception also benefits from the integration of auditory and visual signals. We show that the integration of cross-modal local motion signals that are matched in position and speed is consistent with thresholds predicted by a neural summation model. If the signals are presented in different hemi-fields, move in different directions, or both, then behavioural thresholds are predicted by a probability-summation model. We conclude that cross-modal signals have to be co-localised and co-incident for effective motion integration. We also argue that facilitation is only seen if the signals contain all localisation cues that would be produced by physical objects.
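For readers comparing the two benchmark models named in the abstract: under probability summation, the bimodal stimulus is detected whenever either independent channel responds, so facilitation arises from statistics alone rather than from a neural combination of the signals. The minimal sketch below illustrates that prediction; the Weibull psychometric function, the 0.5 detection criterion, and all parameter values are illustrative assumptions, not the paper's fitted model.

import numpy as np

# Weibull psychometric function (illustrative assumption, not the
# authors' fitted model): probability of detecting a unimodal signal.
def p_detect(intensity, threshold, slope=2.0):
    return 1.0 - np.exp(-(intensity / threshold) ** slope)

# Probability summation: the bimodal signal is detected if either
# independent channel detects it.
def probability_summation(p_a, p_v):
    return 1.0 - (1.0 - p_a) * (1.0 - p_v)

intensity = np.linspace(0.0, 2.0, 201)       # arbitrary stimulus units
p_aud = p_detect(intensity, threshold=1.0)   # auditory channel
p_vis = p_detect(intensity, threshold=1.0)   # visual channel
p_av = probability_summation(p_aud, p_vis)

# Threshold = lowest intensity at which detection reaches the criterion.
criterion = 0.5
thr_uni = intensity[np.argmax(p_aud >= criterion)]
thr_av = intensity[np.argmax(p_av >= criterion)]
print(f"unimodal threshold: {thr_uni:.2f}, bimodal (prob. summation): {thr_av:.2f}")

With these assumed parameters the script prints a bimodal threshold of about 0.59 against a unimodal threshold of about 0.83, i.e. probability summation alone already predicts modest facilitation. The paper's finding is that co-localised, speed-matched audio-visual motion signals yield thresholds lower still, consistent with genuine neural summation, whereas spatially or directionally mismatched signals fall back to the probability-summation prediction.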
Reference:
Low-level integration of auditory and visual motion signals requires spatial co-localisation (Georg F. Meyer, Sophie M. Wuerger, Florian Röhrbein, Christoph Zetzsche), In Exp Brain Res, Springer Science + Business Media, volume 166, number 3-4, pages 538–547, September 2005.
BibTeX Entry:
@Article{Meyer2005,
author = {Georg F. Meyer and Sophie M. Wuerger and Florian Röhrbein and Christoph Zetzsche},
title = {Low-level integration of auditory and visual motion signals requires spatial co-localisation},
journal = {Exp Brain Res},
year = {2005},
volume = {166},
number = {3-4},
pages = {538--547},
month = {sep},
doi = {10.1007/s00221-005-2394-7},
publisher = {Springer Science + Business Media},
url = {http://dx.doi.org/10.1007/s00221-005-2394-7},
}