How to perform robot place recognition with a multi-scale, multi-sensor system inspired by place cells?

Adam Jacobson, Zetao Chen, Michael Milford. Leveraging variable sensor spatial acuity with a homogeneous, multi-scale place recognition framework. Biological Cybernetics, January 2018. https://doi.org/10.1007/s00422-017-0745-7

This paper presents a biologically inspired multi-scale, multi-sensor place recognition system that incorporates the varying spatial localization estimates provided by different sensing modalities to improve overall place recognition performance. The authors model the place cells discovered in the rodent hippocampus, incorporating their multi-scale, multi-sensory nature into the robotic localization framework.

Keywords: Place recognition, Place Cells, Sensor fusion, Robotics

The main contributions are as follows:

  • The researchers develop a localization system that integrates commodity sensors, such as cameras, Wi-Fi, and barometric sensors, which naturally produce localization estimates at different spatial resolutions, within a multi-scale, homogeneous mapping framework.
  • The proposed framework encodes the sensory data associated with places at several different scales, performs recognition at each scale, and then combines the individual per-scale place match hypotheses into a global place match hypothesis (see the illustrative sketch after this list).
  • They provide an analysis of the multi-scale framework, quantifying each sensor’s contribution to the place estimate and evaluating each sensor’s optimal operating scale.
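To make the multi-scale fusion idea concrete, here is a minimal Python sketch (not the authors' implementation). It assumes each sensor's data has already been encoded into place templates at a few spatial scales, computes a match-likelihood vector per scale, and combines the per-scale hypotheses with a simple weighted sum into a global place match hypothesis. All names (`SCALES`, `per_scale_hypothesis`, `global_hypothesis`) and the fusion rule are illustrative assumptions.

```python
import numpy as np

# Hypothetical spatial scales (metres) at which places are encoded, from fine
# (camera-like acuity) to coarse (Wi-Fi / barometric acuity). Illustrative only.
SCALES = [1.0, 5.0, 25.0]


def per_scale_hypothesis(query, database):
    """Score every stored place at one spatial scale.

    `query` (D,) and `database` (N, D) hold templates already aggregated to the
    same scale (e.g. descriptors averaged over a scale-sized region). All scales
    are assumed to index the same N candidate places. Returns a normalised
    match-likelihood vector over the N places.
    """
    dists = np.linalg.norm(database - query, axis=1)
    scores = np.exp(-dists)          # simple similarity, not the paper's metric
    return scores / scores.sum()


def global_hypothesis(query_by_scale, database_by_scale, weights=None):
    """Combine per-scale place-match hypotheses into a global hypothesis.

    The combination here is a weighted sum of per-scale likelihoods; this is an
    assumption for illustration, not the authors' exact fusion rule.
    """
    weights = weights or [1.0] * len(SCALES)
    n_places = len(next(iter(database_by_scale.values())))
    combined = np.zeros(n_places)
    for scale, w in zip(SCALES, weights):
        combined += w * per_scale_hypothesis(query_by_scale[scale],
                                             database_by_scale[scale])
    return combined / combined.sum()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_places, dim = 10, 4
    db = {s: rng.normal(size=(n_places, dim)) for s in SCALES}
    # Query resembles place 3 at every scale, with coarser scales noisier.
    q = {s: db[s][3] + rng.normal(scale=0.1 * s, size=dim) for s in SCALES}
    print("best match:", int(np.argmax(global_hypothesis(q, db))))
```

In this sketch, weighting the scales differently would correspond to trusting some sensors' spatial acuity more than others, which is the kind of per-sensor, per-scale analysis the third contribution refers to.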

Fig. 1 Illustration of multi-scale, multi-sensor fusion for place recognition within a multi-story building.

Fig. 2 Example of the spatially specific place cells firing as a rodent runs down a linear track.

In short, the paper provides a framework for combining sensors that deliver place estimates at different spatial resolutions.

For more information, please read the paper.