- New ‘decade in the making’ low-cost tracking technology reveals remarkable speed at which desert ants memorise homeward journeys
- Desert ants were previously tracked using pen and paper or GPS, leaving gaps in knowledge about their behaviour
- The software works across animal types, and its low cost means it could be used by citizen scientists
- The more complete picture it builds could inspire the next generation of bioinspired robots
Groundbreaking tracking technology that has revealed new insights into how desert ants navigate their complex worlds could inspire the next generation of smart, efficient robots.
An international research collaboration involving the University of Sheffield has developed new tracking technology which uses computer vision - a field of computer science that programs computers to interpret and understand images and videos - to track individual desert ants over their entire foraging lives. The tool documents an ant’s journey from when it first leaves its nest until it finds a food site and returns to its colony.
Their new dataset has revealed that the ants learn incredibly quickly - memorising their homeward paths after just one successful trip. But intriguingly, their outward routes evolved over time, indicating different strategies for exploration versus exploitation. The high precision data also revealed an underlying oscillatory movement that is invisible to the human eye, which can explain how ants generate complex search patterns suited to the current conditions.
As the new software works across animal types, and uses video captured using standard cameras, it is already being adopted by numerous international research groups, and is ideally suited to citizen science projects. The high-precision data gathered is crucial to understanding how brains can guide animals through their complex world, which could inspire a new generation of bioinspired robots.
The new technology and dataset - produced by Dr Michael Mangan, a Senior Lecturer in Machine Learning and Robotics at the University’s Department of Computer Science, with Lars Haalck and Benjamin Risse of the University of Münster, Antoine Wystrach and Leo Clement of the Centre for Integrative Biology of Toulouse and Barbara Webb of the University of Edinburgh - is demonstrated in a new study published in the journal Science Advances.
The study describes how CATER (Combined Animal Tracking & Environment Reconstruction) uses artificial intelligence and computer vision to track the position of an insect in video captured using off-the-shelf cameras. The system can even detect tiny objects that are difficult to see by eye, and is robust to background clutter, obstructions and shadows, allowing it to function in the animal’s natural habitat where other systems fail.
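The study itself does not reproduce its algorithm here, but the general idea of motion-based detection of a small target against a cluttered background can be sketched with off-the-shelf tools. The example below uses OpenCV background subtraction; the file name, blob-size limits and thresholds are illustrative assumptions, not CATER’s actual implementation.

```python
# Illustrative sketch only: motion-based detection of a small moving target
# using OpenCV background subtraction. This is NOT the CATER implementation,
# just a minimal example of the general technique described above.
import cv2

cap = cv2.VideoCapture("field_footage.mp4")   # hypothetical input video
bg = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                        detectShadows=True)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = bg.apply(frame)                              # foreground = moving pixels
    _, mask = cv2.threshold(mask, 200, 255,
                            cv2.THRESH_BINARY)          # drop shadow pixels (marked grey)
    mask = cv2.medianBlur(mask, 5)                      # suppress pixel noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # keep only blobs small enough to plausibly be an ant-sized target
    for c in (c for c in contours if 2 < cv2.contourArea(c) < 200):
        m = cv2.moments(c)
        if m["m00"] > 0:
            x, y = m["m10"] / m["m00"], m["m01"] / m["m00"]
            print(f"candidate detection at ({x:.1f}, {y:.1f})")

cap.release()
```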
Dr Michael Mangan, Senior Lecturer in Machine Learning and Robotics at the University of Sheffield, said: “We captured this data during a summer field trip, but it has taken 10 years to build a system capable of extracting the data, so you could say it’s been a decade in the making.
“I’ve always been fascinated by how these insects can navigate long distances - up to 1 km - in such forbidding landscapes where temperatures are over 50 degrees Celsius.
“Up until now, desert ants have been tracked by hand using pen and paper, which involves creating a grid on the ground with string and stakes and monitoring their behaviour within the grid. Another way to get around this is to use a differential Global Positioning System (GPS), but the equipment is expensive and the precision is low.
“The lack of a low-cost, robust way to capture precise insect paths in the field has led to gaps in our knowledge about desert ant behaviour - specifically about how they learn visual routes, how quickly they do so, and what strategies they employ that might simplify the task.”
CATER’s new visual tracking method addresses these challenges by capturing high resolution footage of ants in their natural environment and using imaging technology to identify individual ants based on motion alone. A novel image mosaicing technique is then used to reconstruct, or stitch together, the landscape from the high resolution imagery. This new approach bridges the gap between field and laboratory studies, providing unique insights into the navigational behaviour of ants. Such data will be crucial in revealing how animals with a brain smaller than a pinhead navigate their complex environments so effectively.
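As a rough illustration of the mosaicing idea, rather than the novel method used in the study, overlapping ground photographs can be stitched into a single reconstructed scene with OpenCV’s built-in Stitcher; the image file names below are placeholders.

```python
# Illustrative sketch only: stitching overlapping ground images into a mosaic
# with OpenCV's Stitcher. CATER uses its own mosaicing technique; this merely
# shows the general idea of reconstructing the environment from imagery.
import cv2

frames = [cv2.imread(p) for p in ["tile_01.jpg", "tile_02.jpg", "tile_03.jpg"]]  # hypothetical files

# SCANS mode suits flat, roughly planar scenes such as the ground around a nest
stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
status, mosaic = stitcher.stitch(frames)

if status == cv2.Stitcher_OK:
    cv2.imwrite("mosaic.jpg", mosaic)     # reconstructed environment image
    print("mosaic size:", mosaic.shape[:2])
else:
    print("stitching failed with status", status)
```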
Such insights are already being turned into commercial products by the pioneering University of Sheffield spin-out company Opteran, which is reverse engineering insect brains to produce highly robust autonomy using low-cost sensors and computing.
Dr Mangan said: “Desert ants are the ideal inspiration for next generation robots - they navigate over long distances, through harsh environments, and don’t rely on pheromone trails like other ants, or GPS and 5G like current robots.
“We hope that our tool will allow us to build a more complete picture of how insects learn to pilot through their habitats, bringing new scientific knowledge and informing engineers about how they could build similarly capable artificial systems.”
The study, CATER: Combined Animal Tracking & Environment Reconstruction, is published in Science Advances.
The University of Sheffield's Department of Computer Science is now recruiting for a PhD project, ‘Using computer vision and machine learning to revolutionise our understanding of insect behaviour’, to further work in this field.
Contact
For further information please contact: