
Audio-visual interactions in the perception of intention from actions


Multisensory Research

Although humans can infer other people’s intentions from their visual actions (Blakemore and Decety, 2001), it is not well understood how auditory information influences this process. We investigated whether auditory emotional information can alter the perceived intention of another person from their visible body motion. Participants viewed videos presenting point-light displays (PLDs) of 10 actors (5 male) who were asked to portray a ‘hero’ or a ‘villain’ character. In a two-alternative forced-choice (2-AFC) design, participants categorised each visual character as having ‘good’ or ‘bad’ intentions; response accuracy and speed were recorded. Performance on visual-only trials exceeded chance, suggesting that participants were efficient at judging intentions from PLDs. We then paired each PLD with auditory vocal stimuli associated with either positive (happy) or negative (angry) emotions. The auditory stimuli, taken from Belin et al. (2008), consisted of nonverbal bursts (‘ah’) recorded from 10 actors (5 male). Each vocalisation was randomly paired with a sex-matched PLD (60 PLD-voice combinations). Both the categorisation responses and their speed were affected by the inclusion of the auditory stimuli: reaction times were faster when the auditory emotion (positive or negative) matched the perceived intention (good or bad, respectively) than in the unisensory conditions. Our findings point to important interactions between audition and visual actions in perceiving the intentions of others and are consistent with previous reports of audio-visual interactions in action-specific visual regions of the brain (e.g., Barraclough et al., 2005).

Affiliations: 1 School of Psychology and Institute of Neuroscience, Trinity College Dublin, Ireland; 2 Graphics, Vision and Visualisation Group, School of Computer Science and Statistics, Trinity College Dublin, Ireland

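The claim that visual-only 2-AFC accuracy exceeded chance can be checked with a one-sided exact binomial test. The sketch below uses purely hypothetical trial counts for illustration, not the study’s data.

```python
from math import comb

def binom_p_above_chance(correct, trials, p=0.5):
    """One-sided exact binomial test: P(X >= correct) when each
    2-AFC trial is a coin flip with success probability p."""
    return sum(comb(trials, k) * p**k * (1 - p)**(trials - k)
               for k in range(correct, trials + 1))

# Hypothetical example: 45 correct out of 60 visual-only trials.
p_value = binom_p_above_chance(45, 60)
print(f"p = {p_value:.5f}")  # a small p means accuracy is reliably above 50%
```

A small p-value here indicates that the observed proportion correct is unlikely under guessing; at exactly chance-level performance (e.g., 30 of 60) the p-value hovers around 0.5, as expected.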

DOI: 10.1163/22134808-000s0115
2013-05-16
2016-12-10
