The internal maps of insects

Webb, Barbara. “The internal maps of insects.” Journal of Experimental Biology 222, no. Suppl 1 (2019): jeb188094.

ABSTRACT

Insect navigation is strikingly geometric. Many species use path integration to maintain an accurate estimate of their distance and direction (a vector) to their nest and can store the vector information for multiple salient locations in the world, such as food sources, in a common coordinate system. Insects can also use remembered views of the terrain around salient locations or along travelled routes to guide return, which is a fundamentally geometric process.

Recent modelling of these abilities shows convergence on a small set of algorithms and assumptions that appear sufficient to account for a wide range of behavioural data. Notably, this ‘base model’ does not include any significant topological knowledge: the insect does not need to recover the information (implicit in their vector memory) about the relationships between salient places; nor to maintain any connectedness or ordering information between view memories; nor to form any associations between views and vectors. However, there remains some experimental evidence not fully explained by this base model that may point towards the existence of a more complex or integrated mental map in insects.

 

Some key questions from this paper (the content below is extracted from Webb, 2019).

Box 1. Comparing insect and robot navigation

It is useful to compare the insect base model presented here with some key issues in robot navigation, as articulated, for example, in Milford and Schulz (2014).

Error accumulation means pure odometry is not viable for any interesting travel range or task

For insects, it is plausible that their odometry is sufficiently accurate in normal foraging conditions that they can depend on it to get near enough to their goal for local mechanisms (e.g. visual memory or an olfactory plume cue) to guide the final approach.
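The odometry underlying path integration can be illustrated with a minimal sketch (the function name and toy trajectory are illustrative, not from the paper): the agent integrates (heading, distance) steps into a single displacement vector, whose negation points back to the nest.

```python
import math

def path_integrate(steps):
    """Accumulate a home vector from (heading, distance) odometry steps.

    Headings are in radians; returns the vector pointing *back* to the nest.
    """
    x = y = 0.0
    for heading, distance in steps:
        x += distance * math.cos(heading)
        y += distance * math.sin(heading)
    # The home vector is the negation of the accumulated outbound displacement.
    return (-x, -y)

# Toy example: walk 3 units east, then 4 units north.
hx, hy = path_integrate([(0.0, 3.0), (math.pi / 2, 4.0)])
home_distance = math.hypot(hx, hy)   # 5.0 units back to the nest
home_bearing = math.atan2(hy, hx)    # bearing of the homeward direction
```

In a real insect these steps come from noisy optic flow or step-counting, so the estimate degrades with distance; the base model assumes the error stays small enough over normal foraging ranges for local cues to take over near the goal.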

Odometry needs to be corrected by recognition of landmarks

This is the core principle of simultaneous localisation and mapping (SLAM), that simultaneous updating of the robot’s own pose relative to landmarks and the geometric layout of the landmarks will converge to an accurate map. The base model assumes that odometry is only reset when home is visited, and arriving at a familiar place is not used to reduce the accumulated error, as no vector information is stored with a view.

How to encode large environments?

The suggested answer is that the insect encodes the environment as a set of vectors with a common origin at the nest, and that it only ever has at most one vector memory active [along with path integration (PI)] to determine its current movement, although it might switch between vectors without returning to the nest. The effective extent of the environment is thus bounded mostly by PI accuracy.
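This common-origin encoding can be sketched as follows (the site names and coordinates are hypothetical): because every stored vector and the current PI state share the nest as origin, steering towards any remembered site is a simple vector subtraction, so the agent can switch between goals without first returning home.

```python
def steering_vector(current_pi, goal_vector):
    """Direction to travel: the remembered nest-to-goal vector minus the
    current PI state (nest-to-agent), both in nest-centred coordinates."""
    return (goal_vector[0] - current_pi[0], goal_vector[1] - current_pi[1])

# Hypothetical food sites, each stored as a vector from the nest.
memories = {"feeder_A": (10.0, 0.0), "feeder_B": (0.0, 8.0)}

# An agent standing at feeder_A (PI state (10, 0)) activates the
# feeder_B memory and can head there directly, without visiting the nest.
dx, dy = steering_vector((10.0, 0.0), memories["feeder_B"])
```

Note that this requires no stored relationship *between* feeder_A and feeder_B: the route between them falls out of the shared coordinate frame, which is why the base model needs no topological map.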

How are visual locations/landmarks recognised from different viewpoints?

Visual locations are not recognised as such; they can only evoke a stronger or weaker sense of familiarity. Moreover, there is no viewpoint invariance; indeed, the whole principle of the proposed view-memory guidance system is that the animal experiences familiarity only when it adopts the same viewpoint, thus informing it that it is now facing the goal.
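A minimal sketch of this familiarity-based guidance (toy 1-D panoramas; the function names are illustrative): the agent compares its current panoramic view, at each possible rotation, against a view stored while facing the goal, and the rotation that minimises the image difference (i.e. maximises familiarity) indicates the goal-facing heading.

```python
def image_difference(view_a, view_b):
    """Sum of squared pixel differences between two 1-D panoramic views."""
    return sum((a - b) ** 2 for a, b in zip(view_a, view_b))

def best_heading(stored_view, current_panorama):
    """Scan all rotations of the current panorama and return the rotation
    (in pixels) that is most familiar, i.e. minimises the difference to
    the stored memory."""
    n = len(current_panorama)
    scores = {
        r: image_difference(stored_view,
                            current_panorama[r:] + current_panorama[:r])
        for r in range(n)
    }
    return min(scores, key=scores.get)

# Toy 8-pixel panorama, stored while facing the goal.
memory = [0, 1, 2, 3, 4, 5, 6, 7]
# The agent's current view is the same scene rotated by 3 pixels;
# best_heading recovers the turn that re-aligns it with the memory.
current = memory[-3:] + memory[:-3]
turn = best_heading(memory, current)
```

The key point matches the text: nothing here identifies *which* place the view belongs to; the memory only yields a graded familiarity signal that peaks when the original viewpoint is re-adopted.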

 

Box 2. Some key open questions for insect navigation

How do insects deal with 3D motion and the disturbances to both celestial and terrestrial views caused by pitch and roll of their heads?

How do insects obtain sufficiently accurate speed information for PI from the potentially very noisy inputs of optic flow and step-counting?

How do view memories remain robust under changing light conditions?

How do insects manage to steer a course along a vector direction while facing a different direction, e.g. ants dragging food backward, or bees side-slipping in flight?

Do learning walks and flights have structure consistent with the assumed function (in the base model) of acquiring views from multiple directions towards the nest, and might they serve some additional function such as rehearsing return paths?

What is the physiological basis of the reliable integration memory needed for PI, and the one-shot learning needed for vector and view memories?

What is the physiological basis of the interaction of views and vectors, in particular their weighted combination in behaviour?

Are units in the central complex directly analogous to mammalian head direction cells (Taube, 1998)? Is it possible that view memories resemble place cells (O’Keefe, 1979)? Can we find a connection between the PI mechanisms of insects and the grid cells found in mammals (Moser et al., 2008; see Gaussier et al., 2019)?

For further detail, see the full paper (Webb, 2019).

More information can be found on Webb’s website: http://homepages.inf.ed.ac.uk/bwebb/