Combining fiber tracking and functional brain imaging for revealing brain networks involved in auditory–visual integration in humans

Previous functional magnetic resonance imaging (fMRI) studies identified various brain areas in the temporal and occipital lobes involved in integrating auditory and visual object information. Fiber tracking based on diffusion-weighted MRI suggested neuroanatomical connections between auditory cortex and sub-regions of the temporal and occipital lobe. However, the relationship between functional activity and white-matter tracts remained unclear. Here, we combined probabilistic fiber tracking and functional MRI to reveal the structural connections related to auditory–visual object perception. Ten healthy participants were examined with diffusion-weighted and functional MRI. During the functional examinations they viewed movies of lip or body movements, listened to corresponding sounds (phonological sounds or body action sounds), or received a combination of both. We found that phonological sounds elicited stronger activity in the lateral superior temporal gyrus (STG) than body action sounds, whereas body movements elicited stronger activity in the lateral occipital cortex than lip movements. Functional activity in the phonological STG region and in the lateral occipital body area was mutually modulated (sub-additively) by combined auditory–visual stimulation. Moreover, bimodal stimuli engaged a region in the posterior superior temporal sulcus (STS). Probabilistic tracking revealed white-matter tracts between the auditory cortex and sub-regions of the STS (anterior and posterior) and the occipital cortex. The posterior STS region was also found to be relevant for auditory–visual object perception. The anterior STS region showed connections to the phonological STG area and to the lateral occipital body area. Our findings suggest that multisensory networks in the temporal lobe are best revealed by combining functional and structural measures.

Affiliations: 1: Institute of Psychology, University of Regensburg, DE; 2: Experimental and Clinical Neurosciences Programme, University of Regensburg, DE; 3: University of Liverpool, GB
