How can an allocentric representation of surrounding visual space be constructed and stored by the dorsal visual pathway?

Li, Tianyi, Angelo Arleo, and Denis Sheynikhovich. “A model of a panoramic visual representation in the dorsal visual pathway: the case of spatial reorientation and memory-based search.” bioRxiv (2019): 827667.

Abstract
Primates are primarily visual animals, and understanding how visual information is processed on its way to memory structures is crucial to understanding how memory-based visuospatial behavior is generated. Recent imaging data demonstrate the existence of scene-sensitive areas in the dorsal visual path that are likely to combine visual information from successive egocentric views, while behavioral evidence indicates the memory of surrounding visual space in extraretinal coordinates. The present work focuses on the computational nature of a panoramic representation that is proposed to link visual and mnemonic functions during natural behavior. In a spiking artificial neuron network model of the dorsal visual path, it is shown how time-integration of spatial views can give rise to such a representation and how it can subsequently be used to perform memory-based spatial reorientation and visual search. More generally, the model predicts a common role of view-based allocentric memory storage in spatial and non-spatial mnemonic behaviors.
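The core idea of "time-integration of spatial views" can be illustrated with a toy sketch (this is not the paper's spiking model): each egocentric view covers a limited field of view, but if each view is shifted by the current head direction into a shared allocentric frame and accumulated over time, a panoramic representation of the surrounding scene emerges. All names and parameters below are hypothetical.

```python
import numpy as np

N = 360    # panoramic resolution: one bin per degree of allocentric direction
FOV = 120  # egocentric field of view, in degrees

def integrate_view(view, head_dir_deg, panorama, counts):
    """Write one egocentric view (FOV bins) into the allocentric panorama,
    shifting retinal angles by the current head direction."""
    for i, value in enumerate(view):
        allo = (head_dir_deg - FOV // 2 + i) % N  # allocentric bin for this retinal angle
        panorama[allo] += value
        counts[allo] += 1

# Simulated scene: a feature intensity for every allocentric direction.
scene = np.sin(np.linspace(0, 2 * np.pi, N, endpoint=False)) ** 2

panorama = np.zeros(N)
counts = np.zeros(N)

# The agent rotates in place, sampling overlapping egocentric views over time.
for head_dir in range(0, 360, 60):
    idx = np.arange(head_dir - FOV // 2, head_dir + FOV // 2) % N
    integrate_view(scene[idx], head_dir, panorama, counts)

# Time-averaged accumulation recovers the full panorama.
reconstructed = panorama / np.maximum(counts, 1)
print(np.allclose(reconstructed, scene))  # prints True
```

With a 120-degree field of view sampled every 60 degrees, each allocentric direction is seen twice, so the averaged buffer reconstructs the whole scene even though no single view contained it, which is the sense in which successive egocentric views are combined into extraretinal coordinates.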