
Segregation and Integration of Cortical Information Processing Underlying Cross-Modal Perception



Visual cues from a speaker’s face influence the perception of speech. One demonstration of this influence is the McGurk effect, in which illusory (cross-modal) sounds are perceived following presentation of incongruent audio–visual (AV) stimuli. Previous studies have reported the engagement of spatially distributed cortical modules during cross-modal perception; however, the limits of the underlying representational space and the cortical network mechanisms remain unclear. In this combined psychophysical and electroencephalography (EEG) study, participants reported their perception while listening to a set of synchronous and asynchronous incongruent AV stimuli. We identified the neural representation of subjective cross-modal perception at two organizational levels: at specific locations in sensor space, and at the level of the large-scale brain network estimated from between-sensor interactions. Cross-modal perception was associated with an enhanced positivity of the event-related potential peaking around 300 ms after stimulus onset. At the spectral level, cross-modal perception involved an overall decrease in power over frontal and temporal regions across multiple frequency bands and at all AV lags, together with increased power over the occipital scalp region for synchronous AV stimuli. At the level of large-scale neuronal networks, enhanced gamma-band functional connectivity involving frontal regions served as a marker of AV integration. Thus, we report in a single study that both segregation of information processing at individual brain locations and integration of information over candidate brain networks underlie multisensory speech perception.
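A common choice for quantifying between-sensor interactions in EEG, as the abstract describes, is the imaginary part of coherency, which discounts zero-lag coupling that can arise from volume conduction rather than genuine neuronal interaction. The following is a minimal numpy sketch of that measure for a pair of sensors with epoched data; it is an illustration of the standard definition, not the authors' actual analysis pipeline, and the function name and parameters are hypothetical.

```python
import numpy as np

def imaginary_coherency(x, y, fs=250.0, nfft=256):
    """Imaginary part of coherency between two EEG sensors.

    x, y : arrays of shape (n_epochs, n_samples), one row per trial epoch.
    fs   : sampling rate in Hz.
    Returns (freqs, imcoh), the frequency axis and the imaginary
    coherency at each frequency.
    """
    # Per-epoch spectra; a Hann window reduces spectral leakage.
    win = np.hanning(x.shape[1])
    X = np.fft.rfft(x * win, n=nfft, axis=1)
    Y = np.fft.rfft(y * win, n=nfft, axis=1)

    # Cross- and auto-spectra, averaged over epochs.
    Sxy = np.mean(X * np.conj(Y), axis=0)
    Sxx = np.mean(np.abs(X) ** 2, axis=0)
    Syy = np.mean(np.abs(Y) ** 2, axis=0)

    # Coherency is the normalized cross-spectrum; its imaginary part
    # is nonzero only for phase-lagged (non-instantaneous) coupling.
    coherency = Sxy / np.sqrt(Sxx * Syy)
    freqs = np.fft.rfftfreq(nfft, d=1.0 / fs)
    return freqs, np.imag(coherency)
```

Two sensors sharing a 40 Hz (gamma-band) rhythm at a consistent phase lag will show a pronounced imaginary-coherency peak near 40 Hz, whereas purely instantaneous mixing of a common source yields values near zero.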

Affiliations: 1: Cognitive Brain Lab, National Brain Research Centre, NH 8, Manesar, Gurgaon 122051, India; 2: Centre of Behavioural and Cognitive Sciences, University of Allahabad, Allahabad 211002, India

*To whom correspondence should be addressed. E-mail:


