The very first way babies communicate is through gestures (well, in addition to crying, that is). They may express “I want that” by pointing distinctly toward an object. As people grow older, gestures are not abandoned, but both spoken and body language become more ambiguous. Oftentimes, adults will point and gesture while talking without directly referring to anything present, at the same time using ambiguous words like “this” or “that.” A new study explores this ambiguity – finding that we constantly evaluate gestures and prioritize their meaning based on how helpful they prove to be over time.
“When you listen carefully to what a person utters, you will notice that there is a lot of ambiguity in speech,” says Thomas Gunter of the Max Planck Institute for Human Cognitive and Brain Sciences, co-author of the new study with J. E. Douglas Weinbrenner. “In our design, we used ambiguity to investigate the effects of abstract pointing.”
In the study, participants watched videos, while undergoing EEG recordings, in which an actress was interviewed about various topics, such as pets. The actress pointed several times to the left side to refer to, for example, cats, and to the right to refer to dogs. In the first experiment, the pointing was sometimes consistent and other times not, while in the second experiment, the pointing stayed consistently reliable. After the interviewer’s last question (‘Which animal do you like the most?’), the interviewee would say something like: ‘I really like this animal because I have experienced that cats have a mind of their own.’ The first part of the statement (‘this animal’) was clearly ambiguous.
As published in the Journal of Cognitive Neuroscience, in the first experiment, when pointing was unreliable, participants did not use the gestures to disambiguate the referent. But when the pointing was reliably valid in the second experiment, it did help participants track the object of the story (e.g., the cat or dog).
“Our experiments show that abstract pointing can be used for reference tracking,” Gunter says. “It has direct benefits for understanding on a discourse level. During multimodal communication, gesture input is constantly monitored, evaluated, and prioritized.”
CNS talked with Gunter about the two experiments (including why and how his team decided to design the second experiment differently), the importance of understanding gestures in communication, and what’s next for the research.
CNS: How did you become interested in this research area?
Gunter: Personally, I became interested in this research area years ago, when we performed some studies on action understanding and observation. We were using hand postures in this research, and given my interest in language processing, the link with gestures was easily made. We started off with static pictures of meaningful hand postures – gestures called emblems, like the ‘thumbs-up’ sign. In a later phase, we turned to dynamic stimuli and recorded videos in which an actress uttered sentences while producing iconic or beat gestures; she wore a black hood to mask lip movements. Because these stimuli are still very restricted in many respects, we turned to relatively natural and extended interview situations, as in the new study, where only the face was blurred and abstract pointing was performed.
CNS: Why is it important to understand gestures?
Gunter: Gestures are important to study because they are an integral part of human face-to-face communication. Just look around: Whenever you see people interact, they also gesture. Although the question of whether gestures actually contain useful information is obsolete, it is still important to try to understand what kind of information they transfer, for what purpose, when they are most effective, and how gesture information is integrated into the system we call language. From a neuro-cognitive perspective, gesture research can tell us how meta-linguistic information is fed into the language system, including the relevant brain structures involved.
CNS: What have we known until now about gestures in communication?
Gunter: When gestures are used in a communicative setting, we know for instance that:
- Gesture rate is adapted to the listener, making clear that gestures have an interactional/communicative function.
- Gesture perception interacts with the eye-gaze of the producer.
- Gestures are helpful in noisy environments and for people with hearing problems.
- Gestures transfer information that potentially could add semantic information, disambiguate meaning/syntax/referencing, put words into focus, and refer indirectly or directly to things or agents, etc.
CNS: What new insight were you seeking with this study?
Gunter: Our major goal was to explore whether and how abstract pointing impacts language processing. As nothing was known from an experimental perspective (there was only one observational study on abstract pointing) at the time we started these experiments, we were interested in whether this type of gesture can indeed be used to track references (‘him/her’, ‘it’, etc.) in a larger context and can be beneficial for the perceiver.
CNS: Why study pointing specifically?
Gunter: Concrete pointing is the first communicative tool an infant has; it is used to refer to objects in space, and we do it all the time. In our study, however, we look at abstract pointing, which has a very remarkable property: You point into space without any object actually being present, and still the gesture seems to represent something you refer to. From a developmental perspective, this type of gesture is probably the last to be acquired and, although deceptively simple in its form, its function is very complex.
Abstract pointing operates at the level of discourse and relates to storytelling. It is associated with the most detailed level of a story and could, for instance, be used to indicate the different players in a story. This so-called reference tracking is of utmost importance for communication, where everyone present in a communicative situation needs to unambiguously identify the characters talked about.
Compared to other gesture types, abstract pointing operates on a much higher, non-local level of abstraction and seems able to influence the mental representation of the communicative situation. Thus, although the well-known iconic gestures (i.e., ‘flapping’ with both hands to represent a bird) clearly give us semantic information beyond speech, this information is typically integrated immediately or within a certain time-window of integration within a sentence. The effects of abstract pointing evolve on a much larger timescale, across sentences.
CNS: What were you most excited to find?
Gunter: The most exciting finding was that there was a beneficial effect of abstract pointing when gesture use was reliable. It was surprising that when the gestures were unreliable, the EEG data suggested that only incorrect gestures were actively processed. For me, it was very exciting to work out what the combined data probably mean.
At the moment, we hypothesize that as a default strategy, the cognitive system assumes that gestures are communicative and, unless proven otherwise, abstract pointing will be used as a tool for reference tracking in longer discourses. During this multimodal communication, gesture input is constantly monitored and evaluated. In situations where gestures are less clear or plainly wrong, they will get less priority or will be discarded completely. Possibly this is a general mechanism, and it might be that in multimodal communication, the visual channel is constantly monitored and evaluated for how helpful or effective it is, thereby revealing a mechanism that prioritizes the relevance of cues in communication.
CNS: You designed a second experiment when you had a hard time explaining the results of the first experiment. Can you talk about that a bit?
Gunter: The data of this first experiment suggested that incongruent gestures were affecting processing but that congruent gestures were not of any help. Although intriguing, this finding was against the vast majority of known gesture studies that typically show gestures being meaningful and helpful. What was happening here? Were abstract pointing gestures maybe special in this respect?
After careful consideration, we looked at our completely balanced experimental design from the perspective of the participant. From their point of view, it became clear after a few trials that, in contrast to what can pragmatically be expected from a communicative situation, the gestures were not a very useful cue for reference tracking, because only half of them gave the correct information. In Experiment 2, we therefore changed the experimental design and presented the participants solely with the baseline and same-referent conditions [where the pointing matched the object being discussed]. In this experiment, we found clear beneficial effects of abstract pointing. I take this as an interesting lesson, because the reduced design is, from an experimental perspective, less optimal and would rightly have been rejected for publication as a standalone experiment. Only in combination with the first experiment does it make sense.
CNS: What’s next for this work?
Gunter: One question relates to how fast the prioritization mechanism works – i.e., after how many invalid gestures are they discarded? How long does it take for a ‘blocked’ system to start using gestures again? Another line of research would explore whether a similar monitoring and prioritization mechanism is present for other communication cues (think of speaker identity cues, eye-gaze, etc.). It would also be interesting to know how gesture and other communicative cues are ‘fed’ into the left-lateralized language system, using, for instance, fMRI.
-Lisa M.P. Munoz