
Integration Mechanisms for Heading Perception



Previous studies of heading perception suggest that human observers employ spatiotemporal pooling to accommodate noise in optic flow stimuli. Here, we investigated how spatial and temporal integration mechanisms are used for judgments of heading through a psychophysical experiment involving three different types of noise. Furthermore, we developed two ideal observer models to study the components of the spatial information used by observers when performing the heading task. In the psychophysical experiment, we applied three types of direction noise to optic flow stimuli to differentiate the involvement of spatial and temporal integration mechanisms. The results indicate that temporal integration mechanisms play a role in heading perception, though their contribution is weaker than that of the spatial integration mechanisms. To elucidate how observers process spatial information to extract heading from a noisy optic flow field, we compared psychophysical performance in response to random-walk direction noise with that of two ideal observer models (IOMs). One model relied on 2D screen-projected flow information (2D-IOM), while the other used environmental, i.e., 3D, flow information (3D-IOM). The results suggest that human observers compensate for the loss of information during the 2D retinal projection of the visual scene for modest amounts of noise. This suggests the likelihood of a 3D reconstruction during heading perception, which breaks down under extreme levels of noise.
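
As a rough illustration of the geometry behind the paradigm described above (not the authors' implementation), the Python sketch below generates a purely translational optic flow field, perturbs the direction of each flow vector with an accumulating random-walk angle, and recovers the heading, i.e., the focus of expansion, by least squares from the noisy 2D screen-projected vectors, loosely analogous to a 2D ideal-observer scheme. All parameter values, the noise magnitude, and the estimator itself are illustrative assumptions rather than details taken from the study.

# Illustrative sketch only: translational optic flow with random-walk
# direction noise and a least-squares focus-of-expansion estimate.
import numpy as np

rng = np.random.default_rng(0)

# --- Scene and observer parameters (assumed values for the demo) ---
f = 1.0                                # focal length (arbitrary units)
n_dots = 200                           # number of dots in the flow field
n_frames = 10                          # frames over which direction noise random-walks
heading = np.array([0.1, 0.0, 1.0])    # observer translation (Tx, Ty, Tz)
sigma_step = np.deg2rad(10)            # per-frame std of the direction random walk

# Random 3D dot positions in front of the observer
X = rng.uniform(-2, 2, n_dots)
Y = rng.uniform(-2, 2, n_dots)
Z = rng.uniform(2, 10, n_dots)

# Project to the image plane and compute the translational flow field
x, y = f * X / Z, f * Y / Z
u = (x * heading[2] - f * heading[0]) / Z
v = (y * heading[2] - f * heading[1]) / Z

# --- Random-walk direction noise ---
# Each dot's flow direction is rotated by an angle that accumulates
# from frame to frame; vector speed is preserved.
theta = np.zeros(n_dots)
for _ in range(n_frames):
    theta += rng.normal(0.0, sigma_step, n_dots)
cos_t, sin_t = np.cos(theta), np.sin(theta)
u_n = cos_t * u - sin_t * v
v_n = sin_t * u + cos_t * v

# --- Least-squares heading (focus-of-expansion) estimate from 2D flow ---
# For pure translation every flow vector points away from the FOE, so
#   v * (x - x_foe) - u * (y - y_foe) = 0,
# which is linear in (x_foe, y_foe).
A = np.column_stack([v_n, -u_n])
b = v_n * x - u_n * y
foe_est, *_ = np.linalg.lstsq(A, b, rcond=None)

foe_true = f * heading[:2] / heading[2]
print("true FOE:     ", foe_true)
print("estimated FOE:", foe_est)

The least-squares step exploits the fact that, under pure observer translation, every flow vector points directly away from the focus of expansion, whose image location coincides with the heading direction; modelling rotational flow components or an explicit 3D reconstruction stage, as in the 3D-IOM, would require a more elaborate scheme.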

Affiliations:
1: Brain and Vision Research Laboratory, Department of Biomedical Engineering, Boston University, 44 Cummington Street, Boston, MA 02215, USA
2: Brain and Vision Research Laboratory, Department of Biomedical Engineering, Boston University, 44 Cummington Street, Boston, MA 02215, USA; Department of Biomedical Engineering, Marquette University, P.O. Box 1881, Milwaukee, WI 53201, USA
3: Brain and Vision Research Laboratory, Department of Biomedical Engineering, Boston University, 44 Cummington Street, Boston, MA 02215, USA; Department of Neurology, Harvard Medical School, 75 Francis Street, Boston, MA 02215, USA
Email: vaina@bu.edu

