
Single-object consistency facilitates multisensory pair learning: Evidence for unitization



Seeing and Perceiving

Learning about objects often involves associating multisensory properties such as the taste and smell of a food or the face and voice of a person. Here, we report a novel phenomenon in associative learning in which pairs of multisensory attributes that are consistent with deriving from a single object are learned better than pairs that are not. In Experiment 1, we found superior learning of arbitrary pairs of human faces and voices when they were gender-congruent — and thus were consistent with belonging to a single personal identity — compared with gender-incongruent pairs. In Experiment 2, we found a similar advantage when the learned pair consisted of species-congruent animal pictures and vocalizations vs. species-incongruent pairs. In Experiment 3, we found that temporal synchrony — which provides a highly reliable alternative cue that properties derive from a single object — improved performance specifically for the incongruent pairs. Together, these findings demonstrate a novel principle in associative learning in which multisensory pairs that are consistent with having a single object as their source are learned more easily than multisensory pairs that are not. These results suggest that unitizing multisensory properties into a single representation may be a specialized learning mechanism.

Affiliations: 1: Florida Atlantic University, US

