A smile, a wave, a head nod – seemingly small communicative gestures are of vital importance, even for babies. New research finds that the brain processes these social cues faster than previously thought, in as little as 70 milliseconds. The finding offers insight into the brain’s priorities and may further understanding of both typical and atypical social interaction.
“Detection of social-communicative cues seems to be of sufficiently high importance that the brain prioritizes the processing of these cues to ensure that we respond quickly,” says Elizabeth Redcay of the University of Maryland. While previous research had shown rapid processing for emotional stimuli compared to neutral ones, the new study was the first to look at “self-relevant” communicative cues. “We’re referring to human actions that not only convey meaning but also signal an intent to communicate, like waving hello or come here, pointing, or holding up your hand to indicate ‘stop,’” she explains.
Developmental and clinical psychologists have long established that social interactions are vital to learning and to shaping our environments. And atypical engagement in social interactions – as seen in autism – can be detrimental to learning.
“Given the importance of these interactions, I think it’s crucial that we understand how the brain detects, processes, and motivates engagement in these social interactions,” Redcay says. “The goal of my research program is to elucidate the neural and cognitive components of social interaction – within typical and atypical development – and this project represents one step in that direction.”
Using a whole-brain magnetoencephalography (MEG) method, Redcay and colleague Tom Carlson, now at Macquarie University in Australia, wanted to pin down the timing of when the brain discriminates social-communicative cues. MEG records, with millisecond accuracy, the magnetic fields across the scalp that are generated by large populations of neurons firing within the brain.
While in the MEG chamber, participants viewed images of a woman performing various actions. Some gestures were communicative, while others were non-communicative, such as grooming behaviors (smoothing her hair, for example). Prior to the MEG recording, the researchers developed ratings of the gestures’ self-relevance and emotional content.
Self-relevance was rated based on how much participants felt the woman wanted to communicate with them: highly self-relevant cues included the actress extending her hand for a handshake or applauding, for example, while minimally self-relevant cues included grooming gestures, stretching, or meaningless hand signs. For emotional content, a gesture such as applause would convey a positive emotion, while a tsk-tsk gesture would convey a negative one.
As published earlier this month in Social Cognitive and Affective Neuroscience, the researchers found that participants’ brains discriminated emotional cues in as little as 70 milliseconds and self-relevant cues in as little as 100 milliseconds. “The fact that these gestures could be discriminated by the brain was not surprising, but we were surprised – and excited – to see how early this discrimination occurred, even when controlling for visual differences,” Redcay says.
Previous studies of a simpler communicative cue – someone gazing toward versus away from another person – found discrimination 100 to 200 milliseconds later than the timing observed in this new study. “We think some reasons for the earlier discrimination in our study may be that we used whole-body gestures, which provided a more naturalistic context, and whole-brain decoding methods, which provide greater spatial and temporal sensitivity,” Redcay explains.
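The whole-brain decoding approach Redcay describes can be illustrated in outline: a classifier is trained on the pattern of activity across all MEG sensors separately at each time point, and the earliest time at which the two gesture categories can be told apart better than chance gives an estimate of the discrimination latency. The Python sketch below uses simulated data and scikit-learn only to make the idea concrete – the array sizes, the planted 70-millisecond effect, and the accuracy threshold are all hypothetical, not the study’s actual pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical dimensions: trials x sensors x time points (1 ms resolution).
n_trials, n_sensors, n_times = 200, 100, 120
times_ms = np.arange(n_times) - 20          # -20 ms to +99 ms relative to gesture onset

# Simulated labels: 0 = grooming gesture, 1 = communicative gesture.
labels = rng.integers(0, 2, n_trials)
data = rng.standard_normal((n_trials, n_sensors, n_times))

# Plant a weak condition difference in some sensors from ~70 ms onward,
# purely so this simulation has something to find.
effect = 0.5 * (times_ms >= 70)
data[labels == 1, :20, :] += effect

# Time-resolved decoding: cross-validate a classifier on the full sensor
# pattern separately at each time point.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
accuracy = np.array([
    cross_val_score(clf, data[:, :, t], labels, cv=5).mean()
    for t in range(n_times)
])

# Crude latency estimate: the first time point where accuracy clearly
# exceeds chance (0.5). A published analysis would use proper statistics,
# such as permutation testing, rather than a fixed threshold.
above = np.where(accuracy > 0.6)[0]
if above.size:
    print(f"Earliest above-chance decoding at ~{times_ms[above[0]]} ms")
```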
Understanding the exact timing of cue processing is important for exploring how people communicate socially. The very short latencies observed suggest that these cues do not require lengthy, recurrent feedback processing. “This fits with theories suggesting that social-communicative cues are prioritized by humans from the first days of life onwards,” Redcay says.
The overarching goal, she says, is to help understand cases in which these cues may not be as strongly prioritized, such as in people with autism. Redcay and her colleagues are currently running fMRI and behavioral studies in adults and children to test how sensitive they are to “communicative context” – everything from a simple gaze, to the gesture cues examined in this new study, to actual real-time interactions with a social partner.
-Lisa M.P. Munoz