Dissociating pitch and loudness interactions between audition and touch

In this talk, I review a series of studies characterizing the relationship between audition and touch. We perceive the frequency and intensity of environmental oscillations (sounds and vibrations) using both modalities. Audio-tactile interactions in the frequency domain are frequency-specific and bidirectional: the interaction patterns support the existence of shared (supramodal) frequency representations. In contrast, audio-tactile interactions in the intensity domain obey a separate set of integration rules. Thus, a pair of auditory and tactile inputs combines differently depending on the perceptual task (i.e., pitch vs. loudness discrimination). That distinct rules govern the integration of auditory and tactile signals in pitch and loudness perception implies that the two processes rely on separate neural mechanisms. Other perceptual processes that combine auditory and tactile signals, such as stimulus detection or spatial localization, may likewise follow their own integration rules, reflecting dissociable neural mechanisms. These results underscore the complexity and specificity of multisensory interactions.

Affiliation: Johns Hopkins University, Baltimore, MD, USA
