Here we assert that gesture is not just an accessory to the language system but an integral partner in communication. A broader approach to the study of language provides insights into these rich communicative contexts. On this view, language is a dynamic process that is locally constructed between communication partners and leverages multiple modalities of information, including gesture. At a minimum, we have reviewed literature showing that gesture has essential communicative functions above and beyond speech; researchers studying neurogenic communication disorders should therefore also work to characterize the consequences of these disorders for the gestural modality.
Indeed, as a field, we know much less about gesture in these disorders than about spoken language, and less about gesture than about other non-verbal aspects of communication such as eye gaze or facial affect recognition.
Much of the existing research on gesture in these populations has characterized spontaneous gesture production, but often from an atheoretical perspective. Studying gesture in populations with impairments in language and cognition provides a unique opportunity to test hypotheses generated by theoretical accounts in the gesture literature on healthy adults, which suggest that gesture provides cognitive and linguistic benefits. Indeed, despite well-documented deficits in memory and social communication in cognitive-communication disorders, researchers have not explored whether patterns of brain injury or cognitive deficit predict gesture use, or whether gestures can improve memory and communicative function in these individuals.
Of all the functions of gesture described here, perhaps the most exciting is the potential benefit of gesture for learning and memory, and the implications this might have for clinical practice. The success of any behavioral therapy depends on the patient's ability to learn and remember the targeted skills.
Yet, rather than incorporating gesture into our interventions, some therapy protocols have inhibited it. This seems counterproductive in light of the potent role of gesture in learning and memory. Rather than discouraging gesture production, it may be more useful to consider the synergistic nature of speech and gesture and explore ways to leverage gesture to achieve various intervention goals across disciplines.
To date, the bulk of the theoretical and empirical work on co-speech gesture has been conducted from a cognitive or psychological perspective rather than a neural one. Thus, while we know a great deal about the cognitive and communicative benefits of gesture, we know less about the neural mechanisms that support them. Applying the psychological literature on gesture to neurogenic communication disorders not only has the potential to improve treatment, but also provides an opportunity to generate and advance theories of co-speech gesture that are both psychologically and biologically plausible.
While this review identifies several gaps in the neurogenic communication disorder literature, it also highlights an exciting opportunity to consider these disorders from a new perspective. Our hands shape and actively alter our own learning and display traces of that learning in conversation, reflecting our prior experiences and depicting knowledge that we may not be explicitly aware of or cannot yet communicate in speech.
In addition to supporting learning and memory, gesture facilitates the exploration of ideas, especially in visuo-spatial problem solving and complex reasoning. Yet we know little about how gesture interacts with cognition in clinical populations, and this knowledge is critical to a full understanding of language, cognition, and communication, and their disorders. Thus, gesture deserves more of our attention in the study of neurogenic communication disorders.
Future research should systematically assess the impact of cognitive and communication disorders on gesture production in larger group studies, and should empirically test the functions of gesture for language use and social cognition. Such research would shed light on the untapped potential of gesture for understanding and rehabilitating neurogenic communication disorders.
SC and MD planned the scope and content of the review. SC wrote the initial version of the manuscript. MD contributed to the final version of the manuscript in writing and editing. Both authors contributed to the article and approved the submitted version. The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Action arrows showed movement, for example, of a part or of the flow of a mixture of air and fuel.
Action effects were depictions of actions, such as ignition, explosion, or compression, as in the bubbles and jagged circle in Fig. … . Labeling arrows or lines connected names with the corresponding depiction of a part, as in Fig. … . Poisson regression analyses were used to model count variables under the assumption that the conditional means equal the conditional variances.
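To make this analysis step concrete, the following minimal sketch fits a Poisson regression to component counts. The data frame, column names, and values are hypothetical placeholders rather than the study's data; statsmodels is one common way to fit such a model.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Toy per-participant data: counts of one diagram component type by condition.
df = pd.DataFrame({
    "viewed_gesture": ["action"] * 3 + ["structure"] * 3,
    "n_action_arrows": [5, 7, 6, 2, 1, 3],  # illustrative counts, not real data
})

# Poisson regression models a count outcome; it assumes the conditional mean
# of the count equals its conditional variance (equidispersion).
model = smf.poisson("n_action_arrows ~ viewed_gesture", data=df).fit()
print(model.summary())
```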
The means of the diagram components by gesture condition appear in Fig. … . Effects of the viewed gestures were apparent in the diagrams: those who saw action gestures showed far more action in their diagrams, in the form of arrows showing direction of movement and depictions of actions, whereas those who saw structure gestures used more lines to label parts.
[Figure caption: Mean number of visual component types produced in visual explanations, by viewed gesture. Error bars represent standard errors of the means.]
Diagrams were coded as complete if they included all four steps of the process and incomplete otherwise. The names of the steps alone did not count as complete.
One video was not recorded due to equipment malfunction. Two participants from the action group and three from the structure group never used their hands, but they were included in the analyses because not producing gestures is a behavioral pattern, if an infrequent one. The average explanation time was … .
Produced gestures were coded as action or structure. Figure 6 shows the mean numbers of gestures of each type by viewed gesture.
[Figure 6 caption: Mean number of action and structure gestures produced, by viewed gesture.]
An example of each appears in Fig. … .
[Figure caption: Examples of gestures produced in videoed explanations. The left panel shows a participant making an action gesture; the right panel shows a participant making a structure gesture.]
Irrespective of viewing condition, participants produced more action gestures than structure gestures. Participants who had viewed action gestures produced an average of … gestures; those who had viewed structure gestures produced an average of … . Although the differences in explanation time were not significant, explanations by participants who had viewed action gestures were on average longer.
Therefore, the analyses were repeated on gesture rate. The same findings emerged. Those who had viewed action gestures produced 7.… gestures; those who had viewed structure gestures produced 4.… . Combining both groups, we found that gesture use correlated with the number of visual components in the diagrams and with scores on the knowledge test, evidence that better understanding is also expressed visually, in gestures and diagrams.
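The rate normalization and correlation just described can be sketched as follows. The per-participant table, its column names, and its values are hypothetical, and Pearson correlation is assumed as the measure of association.

```python
import pandas as pd
from scipy.stats import pearsonr

# Toy per-participant measurements; all values are illustrative.
df = pd.DataFrame({
    "n_gestures": [12, 30, 18, 7, 25, 15],
    "explanation_sec": [60, 120, 90, 45, 100, 80],
    "n_visual_components": [4, 9, 6, 3, 8, 5],
    "knowledge_score": [5, 10, 7, 4, 9, 6],
})

# Normalize raw gesture counts to a per-minute rate to control for the fact
# that some explanations were longer than others.
df["gesture_rate"] = df["n_gestures"] / (df["explanation_sec"] / 60.0)

# Correlate gesture use with the diagram and knowledge measures.
for outcome in ["n_visual_components", "knowledge_score"]:
    r, p = pearsonr(df["gesture_rate"], df[outcome])
    print(f"gesture_rate vs {outcome}: r = {r:.2f}, p = {p:.3f}")
```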
Most of the gestures participants produced were inventions, not imitations of what they had seen. In communicative situations, gesture mimicry is common (e.g., …). Thus, invented gestures are especially indicative of deep understanding, because they are creations of the individuals from their own understanding rather than copies of what they viewed.
Any structure gestures produced by those who viewed the action-gesture video, and any action gestures produced by those who viewed the structure-gesture video, were invented a priori. Also, any additional action gestures produced by those who viewed action gestures, or additional structure gestures produced by those who viewed structure gestures, were coded as invented. Finally, gestures that used hand shapes different from those that had been viewed were coded as invented.
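These coding rules amount to a small decision procedure. The sketch below encodes the two rules that can be stated cleanly: a gesture of the non-viewed type, or a viewed-type gesture with an unviewed hand shape, is invented. The data model and hand-shape labels are hypothetical, and the rule for "additional" gestures beyond those viewed is omitted for simplicity.

```python
from dataclasses import dataclass

@dataclass
class Gesture:
    kind: str        # "action" or "structure"
    hand_shape: str  # e.g. "flat_palm_up" (hypothetical label)

def code_gesture(gesture: Gesture, viewed_condition: str,
                 viewed_shapes: set) -> str:
    """Code one produced gesture as "invented" or "imitated".

    viewed_condition: the video the participant saw ("action" or "structure").
    viewed_shapes: hand shapes that appeared in that video.
    """
    # Rule 1: a gesture of the type NOT shown in the viewed video is
    # invented a priori.
    if gesture.kind != viewed_condition:
        return "invented"
    # Rule 2: same type as viewed, but with a hand shape that never appeared
    # in the video -> invented.
    if gesture.hand_shape not in viewed_shapes:
        return "invented"
    # Otherwise: similar in hand shape to a viewed gesture -> imitated.
    return "imitated"

print(code_gesture(Gesture("action", "flat_palm_up"), "action", {"flat_palm_up"}))
# -> "imitated"
```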
For example, the speaker in panel a of Fig. … made a gesture that was coded as imitated because, in the action instructional video, the speaker spread his right hand with the palm up and moved it upward in the same way. In contrast, the participant in Fig. … made an invented gesture demonstrating a piston moving up (cf. …). Gestures similar in hand shape to those viewed were coded as imitated.
The frequencies of invented and imitated gestures are shown in Fig. … . Those who saw action gestures produced … ; those who had viewed structure gestures produced … .
[Figure caption: Average invented and imitated gestures by gesture type and viewed gesture.]
Analyses of gesture rate corroborated most of these findings. Those who had viewed action gestures produced an average of 6.… ; those who had viewed structure gestures produced an average of 5.… . There were no differences in gesture rate by viewed gesture. Supporting the claims that action information is both more important and harder to convey, of the total of … information units in the speech corpus, … conveyed action, … conveyed structure, and … conveyed other information.
Those who had viewed action gestures produced a total of … information units: … conveying action, … conveying structure, and … conveying other information. Those who had viewed structure gestures produced a total of … information units: … conveying action, … conveying structure, and 97 conveying other information. Figure 10 shows the mean types of information produced by those who had viewed action and structure gestures. Those who had viewed action gestures conveyed relatively more action information in their speech than those who viewed structure gestures.
[Figure 10 caption: Mean number of information units by information type in the two groups.]
Because those who had viewed action gestures produced more speech, the proportions of action, structure, and other information units were analyzed by viewed gesture; the means appear in Fig. … .
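One way to carry out such a proportion analysis is sketched below. The counts are toy values, and the two-sample t-test is an assumption, since the text does not specify which statistical test was used.

```python
import pandas as pd
from scipy.stats import ttest_ind

# Toy per-participant information-unit counts; all values are illustrative.
df = pd.DataFrame({
    "group": ["action"] * 3 + ["structure"] * 3,
    "action_units": [30, 40, 35, 15, 20, 18],
    "structure_units": [10, 12, 9, 25, 30, 27],
    "other_units": [5, 8, 6, 7, 5, 9],
})

unit_cols = ["action_units", "structure_units", "other_units"]
# Convert raw counts to within-participant proportions so that groups can be
# compared despite producing different amounts of speech.
props = df[unit_cols].div(df[unit_cols].sum(axis=1), axis=0)
df["prop_action"] = props["action_units"]

a = df.loc[df["group"] == "action", "prop_action"]
s = df.loc[df["group"] == "structure", "prop_action"]
t, p = ttest_ind(a, s)
print(f"proportion of action units: t = {t:.2f}, p = {p:.3f}")
```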
[Figure caption: Mean percentage of information units by the two groups.]
Those who had viewed action gestures spoke relatively more about action than those who had viewed structure gestures; similarly, those who had viewed structure gestures spoke relatively more about structure.
Dynamic systems are pervasive in our lives but are often difficult to understand. Dynamic systems typically have a structural layer, the parts and their interrelations, as well as a dynamic layer, the actions, changes, behaviors, processes, consequences, and causes that occur over time.
The structural layer is normally easier to convey and easier to comprehend than the dynamic layer. The structural layer is static but the dynamic layer can include many different kinds of actions and contingencies or consequences. Here, we asked whether accompanying explanations of dynamic systems with a sequence of gestures that represent the actions of the parts of the system could enhance understanding of the dynamics of the system. One group of students watched an explanation accompanied by gestures representing the actions of the parts; another group of students watched the same verbal explanation but accompanied by gestures showing the forms of the parts and their spatial array.
The verbal script was identical for both explanations. Both types of gestures are common in spontaneous explanations. A schematic diagram of the spatial array of the names of the parts accompanied both explanations.
The deeper understanding of the dynamics of the system was expressed in many ways: first, in better performance on questions about the action of the system, questions that could be answered solely from the shared verbal script. The deeper understanding was also revealed in participants' diagrams, in their words, and in their own invented gestures.
The visual explanations of those who had seen action gestures contained more arrows indicating direction of movement; they also contained more depictions of the various actions, such as explosion or ignition. Importantly, seeing action gestures provided participants with more complete and comprehensive understandings of the system. The visual explanations revealed that far more of those who had seen action gestures distinguished and included all four stages of the system than those who had viewed structure gestures.
In their own oral explanations, participants from both groups devoted three times as many words and three times as many gestures to explaining the dynamics of the system as to explaining the structure. This is dramatic evidence that the dynamics of a complex system require more explanation than the structure. Deeper understanding of the system dynamics was evident in the oral explanations of the systems by participants who had seen action gestures.
Their explanations contained more words expressing action, despite having heard the same words as those who had viewed gestures conveying structure.
Both groups accompanied their explanations with many gestures, more for action than for structure. The majority of gestures produced by participants in both groups were inventions by the participants. The gestures produced had different forms (hand shapes) from those they had seen; that is, they were not close copies of viewed gestures. Overall, the results demonstrate far-reaching effects of action gestures on understanding. Because the language was the same for both groups, gesture affects understanding over and beyond language.
Watching an explanation of a dynamic system accompanied by gestures representing the sequence of actions led to deeper understanding of the dynamics of the system compared to seeing gestures representing the structure of the parts. The deeper understanding was reflected in a better grasp of the stages of the system, better performance on questions about the dynamics of the system, and more action information expressed in diagrams, words and invented gestures.
Gestures conveying structure had little effect on understanding structure, nor were any effects expected. Structural information is easier to grasp than dynamic information, and a diagram showing structure was used in the viewed explanation. Numerous studies have shown that people express information in gestures that they do not express in speech, important information about their thinking, including structure, action, and more (e.g., …).
Integrated sequences of gestures can create virtual models of complex spaces or complex sequences of actions (Emmorey et al., …). Here, we transferred gestures for expression to gestures for teaching and learning; we found that an integrated series of gestures congruent with action can deepen understanding of the actions of a dynamic system.
This study is by no means the first to demonstrate the power of gesture to instill knowledge. Examples abound, in math (e.g., …). Expressing knowledge visually by means of gesture bears similarities to expressing knowledge in graphics. Both gestures and graphics can abstract, segment, and integrate information to be conveyed or understood (e.g., …). Diagrams are typically multimodal, incorporating and integrating both marks in space, their sizes, formats, and places in space, and also words, symbols, and more to create complete messages.
So, too, are gestures; they are typically an integral part of a complete multimodal message (e.g., …). In much diagrammatic communication (think of maps, science diagrams, assembly instructions), the visual-spatial features of meaning form the core of the message; the words and symbols annotate (e.g., …). There are parallel cases for gesture; that is, the sequence of gestures forms the core of the communication, and the words serve to annotate the gestures (e.g., …).
In many instances, the three (gesture, talk, and diagram) work together, complementing and supplementing each other (e.g., …). Examples abound in sports, dance, musical instruments, cooking, and more. The present results, along with previous studies, make a strong case for incorporating well-crafted gestures and other forms of visual communication into teaching, especially of dynamic systems that entail actions in time.
In most, if not all, cases, the use of gesture to form the core of messages or to complement, disambiguate, and enrich words seems to arise because the information is easier to express and more precise in gesture than in words. In other words, it is more direct and more natural to show than to tell. In addition, information about space and action is often far more precise in gesture than in words.
Pointing to the exact position of a part of an object can be more precise than describing the position; showing the motion of an object can be more precise than describing the motion.
As Talmy (…) analyzed and others have documented (e.g., …), verbal descriptions of action leave much unspecified. Do I push with a finger or a hand or a handle? With one hand or two? Thus, the spatial and action information conveyed in gesture disambiguates and clarifies information that may be ambiguous or imprecise in speech, yielding greater accuracy in communication (e.g., Heiser et al., …). Just as gestures are effective in communicating information more precisely and directly to others, they are also more effective than words alone in comprehending and communicating information for the self (e.g., …).
Many have analyzed the close connections between gesture and action, calling attention to phenomena like motor resonance (e.g., …). Building on those insightful analyses, we propose a more direct relationship between gestures and representations of space and action. Concepts of space and action, and much more, map naturally and directly to places and actions of the hands and the body (Cartmill et al., …).
The hands and the body both are in places and act in space and, therefore, can readily represent places and actions in space. This natural mapping as well as the increased precision of gesture over words makes gestures ideal for representing space and action both for self and for others.
Gestures express meanings directly and, in some cases, can prime the relevant words (e.g., …). Gestures are primary to meaning, not secondary. Before there were words, there were gestures, both ontogenetically and phylogenetically (e.g., …). Babies typically gesture before they speak (e.g., …). In an analysis of the evolution of language drawing on the neurological basis of mirror neurons, Rizzolatti and Arbib (…) postulate that gestures, especially action gestures, grew out of abbreviated actions.
Given that the same neurons in the premotor cortex in monkeys fire when monkeys perform hand actions and view hand actions, abbreviated hand actions could be used to communicate and understand intentions to perform specific actions. Communication canonically began face-to-face in small groups. Face-to-face communication occurs in specific contexts, often around a task or topic related to the context.
Aspects of context can be, and are, used in conversations: pointed to, manipulated, and often given new meanings (e.g., …). As such, face-to-face communication could and can rely on gestures and props, using gestures to bring props in the context into the conversation.
In fact, our vocabularies for certain domains are sparse, crude, abstract, and ambiguous, even for concrete domains central to our existence: faces, space, and action. Gestures can be more precise and show more nuance than words.
Gestures are actions in space, and thereby provide a natural and direct mapping for representing space and action. Gestures are powerful tools for thinking and communicating because they both represent and resemble.