Schedule of Events | Symposia

Symposium Session 7 - Interactions between the brain's visual and memory systems: recent advances and new perspectives

Symposium Session 7: Monday, March 31, 2025, 10:00 am – 12:00 pm EDT, Constitution A

Chairs: Adam Steel¹, Serra Favila²; ¹University of Illinois Urbana-Champaign, ²Brown University
Presenters: Brett Foster, Serra Favila, Adam Steel, Biyu He

Understanding how the human brain integrates externally- and internally-oriented information is a central goal of cognitive neuroscience. Yet conventional studies of brain organization often consider separately how external information is represented in sensory systems and how internal information is represented in memory systems. As a result, current models of brain function fail to account for complex natural behaviors that require dynamic integration of perceptual and mnemonic representations, such as choosing where to direct our gaze as we turn our heads or searching for a familiar face in a crowd. This symposium will highlight new advances in understanding how the brain’s visual and memory systems interact, and the perceptual and behavioral consequences of these interactions. Across four talks, we will present findings from the human visual cortex, default mode network, and hippocampus, addressing several open questions at the interface of perception and memory: 1) How are cortical perceptual and mnemonic representations integrated in space and time? (Foster); 2) How is feature coding in visual cortex similar and distinct during perception and memory, and how is this shaped by cognitive factors? (Favila); 3) How does complementary, opponent coding between visual and memory systems balance and integrate internal cognition with external perception? (Steel); and 4) How does ongoing activity within the default network and visual cortex shape conscious perception? (He). Collectively, this work advances a new framework for understanding how visual and memory systems dynamically exchange information to support complex behaviors.

Presentations

Perceiving the past – how do vision and memory work together?

Brett Foster¹; ¹University of Pennsylvania

Episodic memory enables us to re-experience the past without the original sensory input that constituted a specific item or event. During retrieval, specialized visual regions are thought to be engaged to reinstate sensory details, while specialized memory regions integrate these sensory data to reconstruct past events. This view emphasizes how memory processes shape activity within visual brain regions and, conversely, how the coding properties of these sensory regions influence the types of features integrated within mnemonic regions. This talk will present data from human intracranial recordings and neuroimaging, focusing on the interplay between visual and memory systems. Specifically, I will first present data on how category-selective regions within human ventral temporal cortex (VTC) are shaped by, and support the retrieval of, prior perceptual experiences. Next, I will discuss how these category-selective responses in VTC potentially influence the features and functional organization of the medial parietal cortex (MPC) and its role in retrieval and the construction of past events. Together, these studies provide insight into the spatio-temporal transformations in neural activity that occur when visual systems are engaged in the service of memory-guided behavior. The close interaction between these systems raises new theoretical challenges, particularly in understanding how highly specialized visual systems, shaped through development, maintain functional integrity while accommodating the plasticity needed to support long-term memory consolidation.

Transformations between perceptual and mnemonic activity in the human visual system

Serra Favila¹; ¹Brown University

When we remember an event, we often bring to mind the same sensations and thoughts we experienced initially. This phenomenon has a clear parallel in the brain, where sensory regions that were active during perception are reactivated during recall. While important, the focus on neural similarities between perception and recall has potentially overshadowed important differences between these states. In this talk, I will present human neuroimaging data characterizing mnemonic activity in the visual system, its correspondence to perceptual activity, and how it is shaped by cognitive factors. First, using a visual encoding model to quantify spatial reactivation, I will show that spatial responses in visual cortex are markedly different during recall compared to perception, even for extremely well-trained memories. Simulations and modeling work suggest that this change cannot be attributed to memory failure and is instead a constraint imposed by the hierarchical architecture of the visual system. Second, using a task in which multiple memories compete to guide behavior, I will show that hippocampal mechanisms for separating memories have downstream consequences for mnemonic activity in visual cortex. Together, these studies suggest that mnemonic activity is subject to a different set of fundamental constraints than feedforward activity in the visual system, and that cognitive factors such as memory competition further transform mnemonic representations. These findings advance classic theories of memory reactivation and set the stage for developing models that more fully specify how brain activity is transformed from perception to recall.

Retinotopic coding is a ubiquitous scaffold organizing the brain’s internal and external information processing

Adam Steel¹; ¹University of Illinois Urbana-Champaign

How does the human brain seamlessly integrate internally-oriented (mnemonic) and externally-oriented (perceptual) information? This question has long puzzled neuroscientists, given the traditional view that internally-oriented networks like the default network (DN) and externally-oriented networks like the dorsal attention network (dATN) are globally competitive and implement distinct neural codes. Our research challenges this perspective, revealing a surprising role for the brain’s foundational external visuospatial code, retinotopic coding, in structuring interactions between these internally- and externally-oriented networks. Using advanced neuroimaging techniques, I show that retinotopic coding extends beyond visual areas into higher-order brain regions, including the DN and dATN. Moreover, this retinotopic code identified during visual tasks scaffolds the spontaneous opponent interaction between these networks during resting-state fMRI. Finally, the retinotopic code integrates with the domain-specific preferences of subregions within these networks, enabling efficient, parallel processing of retinotopic and categorical information during resting-state fMRI, as well as during visual perception and memory tasks. These findings suggest that retinotopic coding serves as a fundamental organizing principle for brain-wide communication that structures the dynamic interaction between perceptual and mnemonic systems. This work offers a new framework for understanding how the brain balances and integrates internal cognition with external perception, with implications for our understanding of attention, perception, and memory processes.

How ongoing spontaneous brain activity influences conscious visual perception

Biyu He¹; ¹New York University

Preexisting brain states wield powerful influences on conscious perception. Depending on the preexisting brain state at the time of stimulus arrival, a physically identical stimulus may or may not be consciously perceived, a visual object may or may not be consciously recognized, and we may perceive something that is not out there. Preexisting brain states include both anatomical connections shaped by past experiences and the moment-to-moment fluctuations in spontaneous brain activity. In this talk, I will discuss our recent work investigating the role of spontaneous activity in shaping conscious visual perception in humans. Employing 7 Tesla fMRI and a threshold-level visual perception task, we observed that prestimulus activity originating from distributed brain regions, including visual cortices and regions of the default-mode and cingulo-opercular networks, exerts diverse effects on the sensitivity and criterion of conscious recognition, as well as on categorization performance. We further elucidated the mechanisms underlying these behavioral effects, revealing how prestimulus activity modulates multiple aspects of stimulus processing. In addition, I will discuss our work using MEG to dissect the electrophysiological and frequency-domain signatures of perceptually relevant spontaneous activity.
