Gestures and speech – a balanced couple?
Could you explain to me how to walk from here to where you live? How would you prepare your favorite dish? Could you reason through what you think about life sentences? If you were asked questions like these, how would you answer them? Chances are that you would use your hands while explaining.
Entangled development of speech and gestures
Speech and gestures are tightly coupled when you communicate, explain something, or learn [1]. This tight coupling between the hands and mouth is already apparent in very young children. I know this from day-to-day experience: a few months ago, my little one put her hands in her mouth all the time. In fact, at this very moment she is making music by ‘singing’ “bababawawawa” and slamming two cups against each other – both rhythmic activities. Around 9 to 14 months of age, children’s first words and gestures emerge. Gestures typically precede and hugely outnumber words during this period; the pointing gesture for an object precedes the corresponding word by three months, on average [2]. A nice illustration of gestures’ role in language learning can be found in the video clip below.
Iverson and Thelen [3] propose that, when children learn to speak, gestures are ahead of speech because they are easier to perform and have been performed far more often. Keep in mind that pointing originates from reaching, which in turn emerges from the grasp reflex present in newborns. Verbal utterances – like words – follow gestures, but are at first much more difficult to execute. As children get older and their language skills develop, speech is performed ever more often and becomes ever easier. Eventually, the activation of all the components involved in speech, such as the mouth, jaw, muscles, nerves, and vocal cords, becomes so high that it activates and captures the tightly coupled components involved in gesturing, such as the bones, muscles, and nerves of the hands, arms, and shoulders. As a result, speech and gestures are synchronized, as is usually the case when people communicate.
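To make the idea of one rhythmic system “capturing” another more concrete, here is a minimal sketch – not the model from [3], just a standard toy model of entrainment from the dynamic-systems literature [19]: two coupled phase oscillators standing in for the speech and gesture systems. All names and parameter values below are invented for illustration.

```python
import numpy as np

# A minimal sketch, assuming two Kuramoto-style phase oscillators as
# stand-ins for the speech and gesture systems. Parameter values are
# made up; the point is only the qualitative behavior.
def simulate(omega_speech=1.0, omega_gesture=1.3,
             k_speech=0.05,   # how strongly "speech" adjusts to "gestures"
             k_gesture=0.6,   # how strongly "gestures" adjust to "speech"
             dt=0.01, steps=5000):
    theta_s, theta_g = 0.0, np.pi / 2          # arbitrary starting phases
    diff = np.empty(steps)
    for t in range(steps):
        dtheta_s = omega_speech + k_speech * np.sin(theta_g - theta_s)
        dtheta_g = omega_gesture + k_gesture * np.sin(theta_s - theta_g)
        theta_s += dtheta_s * dt
        theta_g += dtheta_g * dt
        # wrap the phase difference into (-pi, pi]
        diff[t] = np.angle(np.exp(1j * (theta_g - theta_s)))
    return diff

diff = simulate()
print(diff[0], diff[-1])  # the difference settles: the two rhythms have locked
```

Because the coupling is asymmetric, the oscillator that barely adjusts (here “speech”) ends up dictating the shared rhythm – one way to picture a heavily practiced system capturing a less practiced one.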
Gestures and learning
Interestingly, when people – both children and adults – learn something new, speech and gestures tend to dissociate again. During these so-called gesture-speech mismatches, the meaning conveyed by gestures differs from the meaning conveyed by speech [4][5]. Moreover, during these mismatches people express new insights in gestures before they are able to put them into words [6]. Previous studies have attributed this leading role of gestures in cognitive development to gestures being a medium for expressing emerging cognitive strategies [7], highlighting cognitively relevant aspects [8], adding action information to existing mental representations [9], simulating actions [10], decreasing cognitive load during tasks [11], and constructing cognitive insight [12][13][14][15].
In our study [16], we investigated whether and how the leading role of gestures emerges from the real-time interplay between speech and gestures as children perform a science task. Inspired by the theory of complex dynamic systems – one of whose key principles is that the components of a system self-organize into higher-order patterns – we propose that speech and gestures are two coupled synergies. A synergy is a stable organization of components that self-organizes to perform a specific task [17][18][19]. As described above, many components are involved in speech and gestures. Some of these components overlap, such as Broca’s area, while others are involved only in speech or only in gestures, such as the muscles of the face and of the hands, respectively. In the research paper, we outline how gesture-speech mismatches can be explained from a synergetic perspective, but for now I will leave it at that.
Our study
To analyze this dynamic interplay, we coded the gestures and speech of 12 children, aged 5 or 6. For each child, we transformed the codes into two time series and assigned levels of understanding to both gestures and speech. Because these levels of understanding are comparable, we could move to the next step: calculating how often behavioral states (understanding levels) of gestures recurred with behavioral states of speech, and vice versa. This technique is called Cross Recurrence Quantification Analysis (CRQA), and it yields measures of how the behavior of the two systems – gestures and speech – is coupled, in terms of stability, rigidity, and complexity. The CRQA measures also indicate the relative strength and direction of the coupling between speech and gestures. For a detailed description of the measures, I refer you to the research paper.
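To give a flavor of what CRQA does, here is a minimal sketch for two categorical time series. The function names and the toy data are mine, not from the paper, and dedicated packages compute many more measures; the core idea, though, is simply to mark every pair of time points at which the two series are in the same state and then quantify the structure of those marks.

```python
import numpy as np

def cross_recurrence_matrix(x, y):
    """CR[i, j] = 1 when state x[i] equals state y[j]."""
    x, y = np.asarray(x), np.asarray(y)
    return (x[:, None] == y[None, :]).astype(int)

def recurrence_rate(cr):
    """Proportion of recurrent points: overall coupling strength."""
    return cr.mean()

def determinism(cr, lmin=2):
    """Share of recurrent points lying on diagonal lines of length
    >= lmin: how patterned, rather than incidental, the coupling is."""
    total = cr.sum()
    if total == 0:
        return 0.0
    rows, cols = cr.shape
    on_lines = 0
    for offset in range(-(rows - 1), cols):
        run = 0
        for point in np.diagonal(cr, offset=offset):
            if point:
                run += 1
            else:
                if run >= lmin:
                    on_lines += run
                run = 0
        if run >= lmin:
            on_lines += run
    return on_lines / total

def asymmetry(cr):
    """Relative recurrence above vs. below the main diagonal. With the
    first series on the rows, a positive value means the first series
    tends to reach shared states earlier, i.e. it leads."""
    upper = np.triu(cr, k=1).sum()
    lower = np.tril(cr, k=-1).sum()
    return 0.0 if upper + lower == 0 else (upper - lower) / (upper + lower)

# Toy data, invented here: understanding levels (1-4) coded per event.
gestures = [1, 2, 2, 3, 3, 3, 4, 4]
speech   = [1, 1, 2, 2, 2, 3, 3, 4]
cr = cross_recurrence_matrix(gestures, speech)
print(recurrence_rate(cr), determinism(cr), asymmetry(cr))
```

In the toy data, gestures reach each understanding level before speech does, so the asymmetry measure comes out positive – the same logic by which the study gauges which system leads. The actual analyses include further CRQA measures; the sketch only illustrates the core mechanics.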
In sum, our results suggest that speech and gestures were more synchronized and more tightly coupled, with speech leading gestures relatively more strongly, for older children (i.e., age 6) and for children who had performed well on previous language tasks. We propose that the coordination of the speech and gesture systems of these children is more developed. Furthermore, our results suggest that the coupling between speech and gestures is relatively more balanced for children who had performed well on previous science and math tasks. We speculate that, when speech leads relatively less, higher-order understanding has more room to emerge from actions through gesturing.
Studying this will keep my hands full for a while…
We hope that our study will inspire future work and build bridges between the existing literature on gestures and a dynamic perspective on cognition. Considering gestures, speech, and learning as embodied processes is essential for our fundamental and applied understanding of how learning comes about.
Relevant links and publications
[1] Goldin-Meadow, S., Wein, D., and Chang, C. (1992). Assessing knowledge through gesture: using children’s hands to read their minds. Cogn. Instr. 9, 201–219. doi: 10.1207/s1532690xci0903_2
[2] Iverson, J. M., and Goldin-Meadow, S. (2005). Gesture paves the way for language development. Psychol. Sci. 16, 367–371. doi: 10.1111/j.0956-7976.2005.01542.x
[3] Iverson, J. M., and Thelen, E. (1999). Hand, mouth and brain: the dynamic emergence of speech and gesture. J. Conscious. Stud. 6, 19–40.
[4] Church, R. B., and Goldin-Meadow, S. (1986). The mismatch between gesture and speech as an index of transitional knowledge. Cognition 23, 43–71. doi: 10.1016/0010-0277(86)90053-3
[5] Goldin-Meadow, S. (2003). Hearing Gesture: How Our Hands Help Us Think. Cambridge, MA: Harvard University Press.
[6] Garber, P., and Goldin-Meadow, S. (2002). Gesture offers insight into problem-solving in adults and children. Cogn. Sci. 26, 817–831. doi: 10.1207/s15516709cog2606_5
[7] Goldin-Meadow, S., Alibali, M. W., and Church, R. B. (1993). Transitions in concept acquisition: using the hand to read the mind. Psychol. Rev. 100, 279–297. doi: 10.1037/0033-295X.100.2.279
[8] Goldin-Meadow, S., Levine, S. C., Zinchenko, E., Yip, T. K., Hemani, N., and Factor, L. (2012). Doing gesture promotes learning a mental transformation task better than seeing gesture. Dev. Sci. 15, 876–884. doi: 10.1111/j.1467-7687.2012.01185.x
[9] Beilock, S. L., and Goldin-Meadow, S. (2010). Gesture changes thought by grounding it in action. Psychol. Sci. 21, 1605–1610. doi: 10.1177/0956797610385353
[10] Hostetter, A. B., and Alibali, M. W. (2010). Language, gesture, action! A test of the Gesture as Simulated Action framework. J. Mem. Lang. 63, 245–257. doi: 10.1016/j.jml.2010.04.003
[11] Goldin-Meadow, S., Nusbaum, H., Kelly, S. D., and Wagner, S. (2001). Explaining math: gesturing lightens the load. Psychol. Sci. 12, 516–522. doi: 10.1111/1467-9280.00395
[12] Trudeau, J. J., and Dixon, J. A. (2007). Embodiment and abstraction: actions create relational representations. Psychon. Bull. Rev. 14, 994–1000. doi: 10.3758/BF03194134
[13] Stephen, D. G., Boncoddo, R. A., Magnuson, J. S., and Dixon, J. A. (2009). The dynamics of insight: mathematical discovery as a phase transition. Mem. Cogn. 37, 1132–1149. doi: 10.3758/MC.37.8.1132
[14] Stephen, D. G., Dixon, J. A., and Isenhower, R. W. (2009). Dynamics of representational change: entropy, action, and cognition. J. Exp. Psychol. Hum. Percept. Perform. 35, 1811–1832. doi: 10.1037/a0014510
[15] Boncoddo, R., Dixon, J. A., and Kelley, E. (2010). The emergence of a novel representation from action: evidence from preschoolers. Dev. Sci. 13, 370–377. doi: 10.1111/j.1467-7687.2009.00905.x
[16] De Jonge-Hoekstra, L., Van der Steen, S., Van Geert, P., and Cox, R. F. A. (2016). Asymmetric dynamic attunement of speech and gestures in the construction of children’s understanding. Front. Psychol. 7:473. doi: 10.3389/fpsyg.2016.00473
[17] Haken, H. (1977/1983). Synergetics, An Introduction: Nonequilibrium Phase Transitions and Self-Organization in Physics, Chemistry and Biology. Berlin: Springer.
[18] Kugler, P. N., and Turvey, M. T. (1987). Information, Natural Law, and the Self-Assembly of Rhythmic Movement. Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.
[19] Kelso, J. A. S. (1995). Dynamic Patterns: The Self-Organization of Brain and Behavior. Cambridge, MA: The MIT Press.