The Role of Speech-Gesture Synchrony in Clipping Words From the Speech Stream: Evidence From Infant Pupil Responses
How do young infants discover that a segment of the sound stream refers to a particular aspect of the visual world around them? Speakers do not enunciate each word separately, even to infants; rather, whattheysayrunstogether. To relate a word (say, apple) to its referent, infants must notice the interactant's target of attention (the apple) and at the same time single out the word that refers to it as the other person speaks (Lookattheapple!). We contend that caregivers assist through their actions, directing and educating an infant's attention, particularly by means of a show gesture. The onset/offset, rhythm, tempo, and duration of these show gestures are synchronous with the utterance of the words referring to the target objects. Our prior eye-tracking research found that show gestures lead an infant to look at the object presented as the word for it is uttered and that show gestures facilitate word learning. In the present research, we tested the hypothesis that show gestures also lead to enhanced attentional processing, as measured through pupil dilation. Comparing pupil diameters while words were introduced with a show, static, or asynchronous dynamic gesture, we found that pupil dilation occurred in the show-gesture condition and was positively correlated with word learning.
Rader, Nancy de Villiers, and Patricia Zukow-Goldring, "The Role of Speech-Gesture Synchrony in Clipping Words From the Speech Stream: Evidence From Infant Pupil Responses" (2015). Faculty Articles Indexed in Scopus. 966.