Even without paying deliberate attention, most of us can remember what we ate yesterday, where and when we ate it, and can even call up a picture of the meal. This ability is called episodic memory and is considered one of the cognitive functions characteristic of higher primates. Researchers generally divide episodic memory into encoding, maintenance (consolidation), and retrieval stages. In the example above, having dinner yesterday corresponds to the encoding stage, recalling it today corresponds to the retrieval stage, and the interval between the two is the retention stage.
When studying the encoding of episodic memory, researchers are usually driven by an intuitive question: what kind of encoding leads to more successful retrieval later? At the same time, because the topics overlap heavily, research on episodic memory encoding draws on many findings from the study of visual perception, a field that focuses on processing in the present moment and is less concerned with whether the information can be retrieved later.
A classic theory in visual perception research holds that when we look at an object, its feature ("what") and location ("where") information is processed along different brain pathways, which researchers call the "what" and "where" pathways. These two streams of information are then relayed to the medial temporal lobe, the core brain region for episodic memory, via the perirhinal cortex and the parahippocampal cortex, and converge onto the hippocampus through the entorhinal cortex. In other words, after entering the visual cortex from the retina, an object's feature and location information first part ways and are only brought back together in the hippocampus. This final integration matters: without it, we might not be able to correctly bind an object's features to its location.
However, a study by Yuji Naya's group (Cerebral Cortex, 2019) found that, in addition to the classic feature information, area TE in the "what" pathway and the perirhinal cortex in the medial temporal lobe may also encode the location of the participant's current gaze. By recording the spiking activity of single neurons, the Naya group further found that not only the hippocampus but also the perirhinal cortex contained a significant number of cells that represented feature and location information simultaneously, whereas area TE did not. Further analysis showed that the integration of feature and location signals emerged earlier, with a shorter response latency, in the perirhinal cortex than in the hippocampus. These results suggest that the pathway from the "what" stream through the perirhinal cortex to the hippocampus not only carries both feature and location information, but that the integration of the two may already begin in the perirhinal cortex, before the hippocampus.
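To make the idea of a cell "representing feature and location information simultaneously" concrete, here is a minimal, hypothetical sketch in Python. It classifies a single unit by running two separate one-way ANOVAs on trial firing rates, one across object identities and one across gaze locations, and labels the unit "conjunctive" if both effects are significant. The function name, the toy data, and the two-ANOVA shortcut are illustrative assumptions for this article, not the analysis pipeline used by Chen and Naya (2019).

```python
# Illustrative sketch only: classify a single unit as object-selective,
# location-selective, or conjunctive ("what" AND "where") from trial firing rates.
import numpy as np
from scipy import stats

def classify_unit(rates, objects, locations, alpha=0.01):
    """rates: firing rate per trial (spikes/s); objects, locations: condition labels per trial."""
    # One-way ANOVA across object identities ("what" selectivity)
    obj_groups = [rates[objects == o] for o in np.unique(objects)]
    p_obj = stats.f_oneway(*obj_groups).pvalue
    # One-way ANOVA across gaze locations ("where" selectivity)
    loc_groups = [rates[locations == l] for l in np.unique(locations)]
    p_loc = stats.f_oneway(*loc_groups).pvalue
    if p_obj < alpha and p_loc < alpha:
        return "conjunctive"   # carries both feature and location signals
    if p_obj < alpha:
        return "object"
    if p_loc < alpha:
        return "location"
    return "none"

# Toy example: 200 trials, 4 objects x 4 gaze locations, one simulated unit
rng = np.random.default_rng(0)
objects = rng.integers(0, 4, 200)
locations = rng.integers(0, 4, 200)
rates = 5 + 2.0 * objects + 1.5 * locations + rng.normal(0, 1, 200)
print(classify_unit(rates, objects, locations))  # -> "conjunctive" for this toy unit
```

In the same spirit, the latency comparison mentioned above would amount to asking, for each area, at what time after stimulus onset such conjunctive signals first become statistically reliable, and then comparing those onset times between the perirhinal cortex and the hippocampus.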
So why would the brain have more than one site of integration? In the Naya group's study, subjects were asked to look at a single object and remember its feature and location. In daily life, however, when facing a large and complex scene, we often need multiple glances to reconstruct the whole scene. One possibility, therefore, is that the perirhinal cortex in the medial temporal lobe integrates feature and location information within a single view, while the hippocampus integrates this information across multiple views. The Naya group is now testing this conjecture.
The study was published online on August 13, 2019. Chen He, a graduate student in Naya's group at the Peking University-Tsinghua Center for Life Sciences, is the first author of the paper, and Yuji Naya, a researcher at the School of Psychological and Cognitive Sciences and the McGovern Institute for Brain Research at Peking University, is the corresponding author. The research was funded by the National Natural Science Foundation of China and the Peking University-Tsinghua Center for Life Sciences. Special thanks go to Alex, Donald, and Frank for their contributions to this study.
References
Squire LR, Stark CE, Clark RE. 2004. The medial temporal lobe. Annu Rev Neurosci 27:279-306.
Aggelopoulos NC, Rolls ET. 2005. Scene perception: inferior temporal cortex neurons encode the positions of different objects in the scene. Eur J Neurosci 22(11):2903-16.
Chen H, Naya Y. 2019. Forward processing of object-location association from the ventral stream to medial temporal lobe in non-human primates. Cerebral Cortex.