
Attention is required for visual modulation of early auditory speech processing

Poster Session F - Tuesday, April 1, 2025, 8:00 – 10:00 am EDT, Back Bay Ballroom/Republic Ballroom

Yun ZOU1 (yunzou@umass.edu), Lisa D. Sanders1, Alexandra Jesse1; 1University of Massachusetts Amherst

Most daily conversations occur face-to-face, where listeners both hear and see the talker. Visual information from talkers’ faces can significantly enhance speech recognition, particularly when auditory signals are degraded. However, whether attentional resources are necessary for visual speech gestures to influence auditory speech processing remains unclear. Some theories posit that these effects are automatic; others hold that, minimally, listeners must attend to speech for visual effects on early auditory processing to emerge. Using event-related potentials (ERPs), we investigated the role of attention in visual speech effects and whether this influence is modulated by auditory signal ambiguity. Participants heard speech sounds that were either unambiguous ([p] or [t]) or ambiguous (between [p] and [t]), accompanied by unrelated tone sequences. In some trials, participants also saw the speaker’s face, while in others, they only heard the speaker. In half of the blocks (attend-speech blocks), participants categorized the speech sounds as /apa/ or /ata/ while ignoring the tones. In the other half (attend-tone blocks), they categorized the tones as high or low while ignoring the speech sounds. ERP results revealed that when participants attended to the speech sounds, seeing visual gestures reduced N1 responses in the audiovisual condition compared to the auditory-only condition for both ambiguous and unambiguous sounds. However, this early audiovisual facilitation disappeared when attention was directed to the tones, regardless of auditory ambiguity. Attention to speech is thus necessary for early audiovisual interaction during speech processing, and attentional effects on visual influence are consistent for both ambiguous and unambiguous auditory signals.

Topic Area: PERCEPTION & ACTION: Multisensory



March 29–April 1  |  2025
