Published in

Proceedings of the National Academy of Sciences, 117(51), pp. 32329-32339, 2020

DOI: 10.1073/pnas.2006752117

Stable maintenance of multiple representational formats in human visual short-term memory

This paper is made freely available by the publisher.

Preprint: archiving forbidden
Postprint: archiving allowed
Published version: archiving forbidden
Data provided by SHERPA/RoMEO

Abstract

Significance

Visual short-term memory (VSTM) is the ability to actively maintain visual information for a short period of time. Classical models posit that VSTM is achieved via persistent firing of neurons in the prefrontal cortex. Leveraging the unique spatiotemporal resolution of intracranial EEG recordings and the analytical power of deep neural network models in uncovering the neural code of visual processing, our results suggest that visual information is first dynamically extracted in multiple representational formats, including a higher-order visual format and an abstract semantic format. Both formats are stably maintained across an extended period via coupling to phases of hippocampal low-frequency activity. These results suggest that human VSTM is highly dynamic and involves rich, multifaceted representations, contributing to a mechanistic understanding of VSTM.
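The significance statement refers to representational formats being maintained via coupling to the phase of hippocampal low-frequency activity. As a purely illustrative sketch of what such a phase-coupling analysis can look like (using synthetic data and an assumed 1-3 Hz band; this is not the authors' pipeline, data, or coupling measure), one could estimate the slow-oscillation phase and ask whether a representational-strength time course is modulated by it:

```python
# Illustrative sketch only: phase coupling between a slow "hippocampal" signal
# and a decoded representational-strength time course. Sampling rate, band
# limits, and all signals here are synthetic assumptions for demonstration.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000                      # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)   # 10 s of synthetic data

# Synthetic low-frequency trace with a 2 Hz component plus noise
lfp = np.sin(2 * np.pi * 2 * t) + 0.5 * np.random.randn(t.size)

# Synthetic representational strength that peaks near a preferred phase
strength = 0.5 + 0.5 * np.cos(2 * np.pi * 2 * t - np.pi / 4) \
           + 0.3 * np.random.randn(t.size)

# 1) Band-pass the trace in the assumed 1-3 Hz band and extract its phase
b, a = butter(3, [1, 3], btype="bandpass", fs=fs)
phase = np.angle(hilbert(filtfilt(b, a, lfp)))

# 2) Bin representational strength by phase to see whether it is modulated
bins = np.linspace(-np.pi, np.pi, 19)
idx = np.digitize(phase, bins) - 1
tuning = np.array([strength[idx == k].mean() for k in range(len(bins) - 1)])

# 3) Summarize coupling as the resultant vector length of the phase-binned
#    distribution (one common, simple coupling index; values near 0 mean no
#    phase preference, larger values mean stronger coupling)
p = tuning / tuning.sum()
centers = (bins[:-1] + bins[1:]) / 2
coupling = np.abs(np.sum(p * np.exp(1j * centers)))
print(f"phase-coupling index: {coupling:.3f}")
```

In practice, such an index would be compared against a surrogate distribution (e.g., from time-shifted or shuffled data) to assess significance; the sketch above only shows the core phase-binning idea.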