Neuro-Autonomy: Neuroscience-Inspired Perception, Navigation, and Spatial Awareness for Autonomous Robots

MURI Project Title: “Neuro-Autonomy: Neuroscience-Inspired Perception, Navigation, and Spatial Awareness for Autonomous Robots”

Project Websites:

http://sites.bu.edu/neuroautonomy/

https://electrical.eng.unimelb.edu.au/control-signal-processing/neuro-autonomy

 

The following content is adapted from the Boston University website (http://www.bu.edu).

A Boston University-led research team was selected to receive a $7.5 million Multidisciplinary University Research Initiative (MURI) grant from the U.S. Department of Defense (DoD). With this grant, the researchers will develop a novel category of neuro-inspired autonomous robots for land, sea, and air that the investigators have termed “neuro-autonomous.”

[Image: http://sites.bu.edu/neuroautonomy/]

The initiative will tackle the challenge of making robots truly autonomous, while also providing important insights into how learning and memory formation work. The project will be led by Yannis Paschalidis, director of the Center for Information and Systems Engineering and a professor in the College of Engineering at Boston University.

The winning team brings together researchers from Boston University and the Massachusetts Institute of Technology with expertise in neuroscience, robotics, computer science, computer vision, artificial intelligence, mathematical systems theory, and related fields. The project will also benefit from collaboration with researchers at the University of Melbourne, Macquarie University, Queensland University of Technology, and the University of New South Wales.

[Image: Yannis Paschalidis]

The project aims to develop next-generation autonomous vehicles (AVs) capable of learning and adapting on the fly to novel environments. These systems must be orders of magnitude more energy efficient than current systems and able to pursue complex goals in highly dynamic, even adversarial, environments.

Biological organisms already exhibit the capabilities envisioned for next-generation AVs. From insects to birds, rodents, and humans, one can observe the fusion of multiple sensor modalities, spatial awareness, and spatial memory, all functioning together as a suite of perceptual capabilities that enable navigation in unstructured and complex environments. With this motivation, the project will leverage neurophysiological insights from the living world to develop new neuroscience-inspired methods for advanced, next-generation perception and navigation in AVs.
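
As a rough illustration of what fusing multiple sensor modalities can mean in practice (a minimal sketch, not the project's actual method), the Python snippet below combines two hypothetical, noisy estimates of a robot's position by inverse-variance weighting, the static scalar special case of a Kalman update. The sensor names and numbers are invented for illustration only.

import numpy as np

def fuse_estimates(means, variances):
    # Fuse independent noisy estimates of the same quantity by
    # inverse-variance weighting: more precise sensors get more weight,
    # and the fused variance is never larger than the best single sensor's.
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = 1.0 / variances
    fused_var = 1.0 / weights.sum()
    fused_mean = fused_var * (weights * means).sum()
    return fused_mean, fused_var

# Hypothetical 1-D position readings from two modalities:
# a camera-based estimate (low noise) and a sonar estimate (higher noise).
mean, var = fuse_estimates(means=[2.10, 1.85], variances=[0.04, 0.25])
print(f"fused position: {mean:.3f} m, variance: {var:.4f}")

Real systems extend this idea to full state estimation over time (e.g., Kalman or particle filters), but the weighting principle is the same.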

For more detailed information, please visit the BU website and the related articles below:

BU-led Research Team Wins Competitive $7.5 million MURI Grant to Create Neuro-Autonomous Robots

MURI Project Abstract, Approved for Public Release

How to Make Self-Driving Vehicles Smarter, Bolder

Neuro-Autonomous Robots

Aussie and US uni collaboration to work on next-gen autonomous systems