Seeing the Way: the Role of Vision in Conversation Turn Exchange Perception

Multisensory Research

During conversations, we engage in turn-taking behaviour that proceeds back and forth effortlessly as we communicate. On any given day, we participate in numerous face-to-face interactions that contain social cues from our partner, and we interpret these cues to rapidly identify whether it is appropriate to speak. Although the benefit provided by visual cues has been well established in several areas of communication, the use of visual information to make turn-taking decisions during conversation is unclear. Here we conducted two experiments to investigate the role of visual information in identifying conversational turn exchanges. We presented clips, each containing a single utterance spoken by one individual engaged in a natural conversation with another. These utterances occurred either immediately before a turn exchange (i.e., when the current talker would finish and the other would begin) or at a point where the same talker would continue speaking. In Experiment 1, participants were presented with audiovisual, auditory-only and visual-only versions of our stimuli and identified whether or not a turn exchange would occur. We demonstrated that although participants could identify turn exchanges with unimodal information alone, they performed best in the audiovisual modality. In Experiment 2, we presented participants with audiovisual turn exchanges where the talker, the listener or both were visible. We showed that participants suffered a cost in identifying turn exchanges when visual cues from the listener were not available. Overall, we demonstrate that although auditory information is sufficient for successful conversation, visual information plays an important role in the overall efficiency of communication.
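The design described above can be scored straightforwardly: each trial pairs a presentation modality with a ground truth (turn exchange or continuation) and a participant response, and accuracy is computed per modality. The sketch below illustrates this bookkeeping; the trial records and modality labels are invented for illustration and are not the authors' data or analysis code.

```python
# Hypothetical sketch: scoring turn-exchange identification accuracy by
# presentation modality, as in Experiment 1 (audiovisual, auditory-only,
# visual-only). All trial data below are invented for illustration.
from collections import defaultdict

def accuracy_by_modality(trials):
    """Return the proportion of correct turn-exchange judgements per modality.

    Each trial is (modality, is_turn_exchange, response), where
    is_turn_exchange is the ground truth and response is the participant's
    yes/no judgement that an exchange would occur.
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for modality, is_exchange, response in trials:
        total[modality] += 1
        if response == is_exchange:
            correct[modality] += 1
    return {m: correct[m] / total[m] for m in total}

# Invented example trials (two per modality):
trials = [
    ("audiovisual", True, True),
    ("audiovisual", False, False),
    ("auditory", True, True),
    ("auditory", False, True),   # false alarm: exchange reported, none occurred
    ("visual", True, False),     # miss: exchange occurred, none reported
    ("visual", False, False),
]
print(accuracy_by_modality(trials))
```

In a real analysis one would aggregate over many trials per participant and model the responses with mixed-effects logistic regression rather than raw proportions, but the per-modality accuracy shown here is the descriptive starting point.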

Affiliations: 1: Queen’s University Department of Psychology, Humphrey Hall 307, 62 Arch Street, Kingston, ON K7L3K9, Canada

*To whom correspondence should be addressed. E-mail: n.latif@queensu.ca
DOI: 10.1163/22134808-00002582
2017-10-05
2018-10-16
