

Cross-modal associations between vision, touch and audition influence visual search through top-down attention not bottom-up capture


A recent study (Guzman-Martinez et al., 2012) showed that participants match the frequency of an amplitude-modulated auditory stimulus to visual spatial frequency with a linear relationship and suggested that this crossmodal mapping automatically guided attention to specific spatial frequencies. We replicated the reported matching relationship and also performed matching between tactile and visual spatial frequency. We then used the visual search paradigm to investigate whether auditory or tactile cues can guide attention to matched visual spatial frequencies. Participants were presented with a search display containing multiple Gabors, all with different spatial frequencies. When the auditory or tactile cue was informative, improved search efficiency occurred for some matched spatial frequencies, with the specificity of the effect being greater for touch than audition. However, when uninformative, neither auditory nor tactile cues produced any effect on visual search performance. Furthermore, when informative, unmatched auditory cues (shifted substantially from the reported match, but still matched in relative position) improved search performance. Taken together, these findings suggest that although auditory and tactile cues can influence visual selection of a matched spatial frequency, the effects are due to top-down attentional control rather than automatic attentional capture derived from low-level mapping.

Affiliations: 1: University of Sydney, Australia

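The search displays described in the abstract consist of Gabor patches (Gaussian-windowed sinusoidal gratings) at different spatial frequencies. As a rough illustration only, the sketch below generates such a patch in NumPy; the function name and all parameter values (image size, frequencies, envelope width) are assumptions for the example and are not taken from the paper.

```python
import numpy as np

def gabor_patch(size_px=256, cycles_per_image=8.0, sigma_frac=0.15,
                theta=0.0, phase=0.0):
    """Gaussian-windowed sinusoidal grating (a Gabor patch).

    cycles_per_image sets the carrier's spatial frequency in cycles per
    image; converting to cycles per degree would require the viewing
    distance and monitor geometry, which the abstract does not report.
    """
    half = size_px / 2.0
    y, x = np.mgrid[-half:half, -half:half] / size_px  # coordinates in [-0.5, 0.5)
    # Carrier: sinusoid at the requested orientation and spatial frequency.
    xt = x * np.cos(theta) + y * np.sin(theta)
    carrier = np.cos(2.0 * np.pi * cycles_per_image * xt + phase)
    # Envelope: isotropic Gaussian window centred on the patch.
    envelope = np.exp(-(x**2 + y**2) / (2.0 * sigma_frac**2))
    return carrier * envelope  # roughly in [-1, 1]; rescale to the display range as needed

# Illustrative "search display": several Gabors, all at different spatial
# frequencies (these frequency values are arbitrary, not the paper's).
patches = [gabor_patch(cycles_per_image=f) for f in (2.0, 4.0, 8.0, 16.0)]
```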
