
Human temporal coordination of visual and auditory events in virtual reality


Since the speed of sound is much slower than that of light, we sometimes hear a sound later than the accompanying light event (e.g., thunder and lightning at a great distance). However, Sugita and Suzuki (2003) reported that the brain coordinates a sound and its accompanying light so that they are perceived as simultaneous within a distance of 20 m. Thus, in the near field, a light accompanied by a physically delayed sound is perceived as simultaneous with that sound. We aimed to test whether this sound–light coordination occurs in a virtual-reality environment and to investigate the effects of binocular disparity and motion parallax. Six naive participants observed visual stimuli on a 120-inch screen in a darkroom and heard auditory stimuli through headphones. A ball was presented in a textured corridor, and its distance from the participant was varied from 3 to 20 m. The ball turned red before or after a short (10 ms) white-noise burst (time difference: −120, −60, −30, 0, +30, +60, +120 ms), and participants judged the temporal order of the color change and the sound. We varied the visual depth cues (binocular disparity and motion parallax) in the virtual-reality environment and measured the physical delay at which the visual and auditory events were perceived as simultaneous. We did not find sound–light coordination without binocular disparity or motion parallax, but we did find it when both cues were present. These results suggest that binocular disparity and motion parallax are effective for sound–light coordination in a virtual-reality environment, and that the richness of depth cues is important for this coordination.

Affiliation: Department of Computer Science and Engineering, Toyohashi University of Technology, Japan
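To put the numbers in the abstract in context, the sketch below (not taken from the study) illustrates two points: how much a sound physically lags the accompanying light at the viewing distances used, assuming a speed of sound of roughly 343 m/s, and how a point of subjective simultaneity (PSS) can be estimated from temporal-order judgements by fitting a logistic psychometric function. The response proportions, the sign convention (positive SOA = sound after the colour change), and the logistic fit are illustrative assumptions, not the authors' actual data or analysis.

```python
# Minimal, illustrative sketch (not the authors' code).
import numpy as np
from scipy.optimize import curve_fit

SPEED_OF_SOUND = 343.0  # m/s, approximate value in air (assumption)

# (1) Physical sound-arrival delay at the viewing distances used in the study.
for distance_m in (3, 5, 10, 15, 20):
    delay_ms = distance_m / SPEED_OF_SOUND * 1000.0
    print(f"{distance_m:>2} m: sound lags the light event by ~{delay_ms:4.1f} ms")

# (2) PSS estimation from temporal-order judgements.
# SOA > 0 means the noise burst followed the colour change (sound delayed).
soa_ms = np.array([-120, -60, -30, 0, 30, 60, 120], dtype=float)
# Hypothetical proportions of "light first" responses at each SOA (made up
# for illustration; the real data are not given in the abstract).
p_light_first = np.array([0.02, 0.10, 0.25, 0.40, 0.60, 0.80, 0.97])

def logistic(soa, pss, slope):
    """Probability of judging the light as first, as a function of SOA."""
    return 1.0 / (1.0 + np.exp(-(soa - pss) / slope))

(pss, slope), _ = curve_fit(logistic, soa_ms, p_light_first, p0=(0.0, 30.0))
print(f"Estimated PSS: {pss:+.0f} ms (sound delay judged simultaneous with the colour change)")
```

If the visual system fully compensated for viewing distance, the estimated PSS would be expected to grow with distance, approaching roughly 58 ms at 20 m; the abstract's ±120 ms range of time differences comfortably brackets such shifts.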



1. Sugita, Y., & Suzuki, Y. (2003). Audiovisual perception: Implicit estimation of sound-arrival time. Nature, 421, 911. http://dx.doi.org/10.1038/421911a