Figuring out where things are is a basic capability that many robots need. In applications from health care and emergency response to the military and even entertainment, it's a fundamental requirement.
“However, our robots are still far behind the accuracy, efficiency and adaptability of the algorithms which exist in biological systems,” says Zahra Bagheri, a PhD student at the University of Adelaide, who's working on developing new ways of seeing.
She's part of a team of engineers and neuroscientists using the way insects visualise and hunt prey to improve autonomous robotic technology.
“Our aim was to discover whether the behaviour and neuronal mechanisms that underlie an insect’s target detection and selection could provide a blueprint for a robot to perform similar tasks autonomously,” said Steven Wiederman, who is leading the project.
“Insects are capable of remarkably complex behaviour, yet have a miniature brain consuming tiny amounts of power compared with even the most efficient digital processors.”
As part of the research, the team used recordings from specific neurons in the brain of a dragonfly to develop a target detection and tracking algorithm. Then they built that algorithm into a robot's brain.
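The article doesn't describe the team's actual algorithm, but the general principle behind insect-inspired target detection can be sketched. The toy model below (all names and parameters are illustrative assumptions, not the researchers' code) combines two ideas commonly attributed to insect target-detecting neurons: sensitivity to brightness change over time, and centre-surround antagonism that lets a small moving target stand out while larger moving features suppress themselves.

```python
import numpy as np

def detect_small_target(prev_frame, frame, surround=2):
    """Toy small-target detector, loosely inspired by insect
    target-detecting neurons. Illustrative only -- not the
    Adelaide team's actual model."""
    # Temporal contrast: brightness change between consecutive frames
    delta = np.abs(frame.astype(float) - prev_frame.astype(float))

    # Centre-surround antagonism: subtract the mean activity of each
    # pixel's neighbourhood, so large moving regions cancel themselves
    # out while a small isolated target keeps a strong response.
    padded = np.pad(delta, surround, mode="edge")
    h, w = delta.shape
    k = 2 * surround + 1
    response = np.zeros_like(delta)
    for i in range(h):
        for j in range(w):
            win = padded[i:i + k, j:j + k]
            surround_mean = (win.sum() - delta[i, j]) / (k * k - 1)
            response[i, j] = max(delta[i, j] - surround_mean, 0.0)

    # Winner-take-all: the strongest response is the candidate target
    return np.unravel_index(np.argmax(response), response.shape)

# A small target that appears between frames wins over the edge of a
# large moving feature, because the large feature suppresses itself.
prev = np.zeros((32, 32))
frame = np.zeros((32, 32))
frame[0:10, 0:10] = 1.0   # large moving feature
frame[12, 20] = 1.0       # small target
print(detect_small_target(prev, frame))  # -> (12, 20)
```

The winner-take-all step at the end mirrors the "selection" aspect mentioned in the research: out of everything moving in the scene, only one location is picked for pursuit.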
"This is the first time that a target tracking model inspired by insect neurophysiology has been implemented on an autonomous robot and tested under real-world conditions," said Wiederman.
In tests, which included low-contrast targets, heavily cluttered environments and the presence of distractors, the robot performed well – suggesting that simple processors can be used to deliver capable results.
"We uncovered insight into how insects’ neuronal systems may handle varying challenges during target tracking and pursuit," said Bagheri.