Dr Michael Mangan
MEng, MSc, PhD
School of Computer Science
Senior Lecturer in Machine Learning and Robotics
Operations Director of Sheffield Robotics
Member of the Machine Learning research group
+44 114 222 1905
Full contact details
School of Computer Science
Regent Court (DCS)
211 Portobello
Sheffield
S1 4DP
- Profile
Michael Mangan joined the Department of Computer Science at the University of Sheffield in April 2018. Before joining, he was a Senior Lecturer in Computer Science in the College of Science at the University of Lincoln and a member of the Lincoln Centre for Autonomous Systems (L-CAS).
He received his undergraduate degree in Avionics (MEng) from the University of Glasgow in 2004, before moving to the University of Edinburgh, where he completed an MSc in Neuroinformatics (2006) and a PhD in Biorobotics (2011).
He remained at the University of Edinburgh for the next four years, receiving funding awards from the BBSRC and EPSRC. He joined the Lincoln Centre for Autonomous Systems in April 2016.
- Research interests
Dr Mangan's primary research focuses on modelling the navigational behaviour of insects, which travel robustly through complex environments despite their limited nervous and sensory systems. Revealing the sensory and algorithmic underpinnings of these capabilities will not only aid the understanding of biological systems but may also offer engineering solutions: robots able to navigate as well as the humble ant.
- Publications
Journal articles
- Neuromorphic sequence learning with an event camera on routes through vegetation. Science Robotics, 8(82).
- CATER: Combined Animal Tracking & Environment Reconstruction. Science Advances, 9(16).
- A virtuous cycle between invertebrate and robotics research: perspective on a decade of Living Machines research. Bioinspiration & Biomimetics.
- CompoundRay, an open-source tool for high-speed and high-fidelity rendering of compound eyes. eLife, 11.
- EchoVPR: Echo State Networks for Visual Place Recognition. IEEE Robotics and Automation Letters, 1-1.
- How the insect central complex could coordinate multimodal navigation. eLife, 10.
- EchoVPR: Echo State Networks for Visual Place Recognition. CoRR, abs/2110.05572.
- The use of light spectrum blocking films to reduce populations of Drosophila suzukii Matsumura in fruit crops. Scientific Reports.
- Route-following ants respond to alterations of the view sequence. The Journal of Experimental Biology, 223(14).
- A decentralised neural model explaining optimal integration of navigational strategies in insects. eLife, 9.
- Multimodal interactions in insect navigation. Animal Cognition.
- Towards image-based animal tracking in natural environments using a freely moving camera. Journal of Neuroscience Methods, 330.
- L*a*b*fruits: a rapid and robust outdoor fruit detection system combining bio-inspired features with one-stage deep learning networks. Sensors, 20(1).
- Rotation invariant visual processing for spatial memory in insects. Interface Focus, 8(4), 20180010-20180010.
- Software to convert terrestrial LiDAR scans of natural environments into photorealistic meshes. Environmental Modelling & Software, 99, 88-100.
- How Ants Use Vision When Homing Backward. Current Biology, 27(3), 401-407.
- Ant homing ability is not diminished when traveling backwards. Frontiers in Behavioral Neuroscience, 10(1).
- Using an Insect Mushroom Body Circuit to Encode Route Memory in Complex Natural Environments. PLoS Computational Biology, 12(2), e1004683-e1004683.
- Optimal cue integration in ants. Proceedings of the Royal Society B: Biological Sciences, 282(1816).
- How variation in head pitch could affect image matching algorithms for ant navigation. Journal of Comparative Physiology A, 201(6), 585-597.
- Insect navigation: do ants live in the now? Journal of Experimental Biology, 218(6), 819-823.
- Still no convincing evidence for cognitive map use by honeybees. Proceedings of the National Academy of Sciences, 111(42), E4396-E4397.
- Snapshots in ants? New interpretations of paradigmatic experiments. Journal of Experimental Biology, 216(10), 1766-1770.
- Spontaneous formation of multiple routes in individual desert ants (Cataglyphis velox). Behavioral Ecology, 23(5), 944-954.
- Modelling place memory in crickets. Biological Cybernetics, 101(4), 307-323.
- Path Integration Using a Model of e-Vector Orientation Coding in the Insect Brain: Reply to Vickerstaff and Di Paolo. Adaptive Behavior, 16(4), 277-280.
- Regarding Compass Response Functions For Modeling Path Integration: Comment on “Evolving a Neural Model of Insect Path Integration”. Adaptive Behavior, 16(4), 275-276.
- Place memory in crickets. Proceedings of the Royal Society B: Biological Sciences, 275(1637), 915-921.
- Evolving a Neural Model of Insect Path Integration. Adaptive Behavior, 15(3), 273-287.
Chapters
- Robust Counting of Soft Fruit Through Occlusions with Re-identification, Lecture Notes in Computer Science (pp. 211-222). Springer International Publishing
- Non-destructive Soft Fruit Mass and Volume Estimation for Phenotyping in Horticulture, Lecture Notes in Computer Science (pp. 223-233). Springer International Publishing
- Spatio-Temporal Memory for Navigation in a Mushroom Body Model, Biomimetic and Biohybrid Systems (pp. 415-426). Springer International Publishing
- Preface (pp. V-VII).
- Biomimetic and Biohybrid Systems. Springer International Publishing
Conference proceedings papers
- An Analysis of a Ring Attractor Model for Cue Integration (pp 459-470)
- Visual tracking of small animals in cluttered natural environments using a freely moving camera. 2017 IEEE International Conference on Computer Vision Workshops (pp 2840-2849). Venice, Italy, 22 October 2017 - 29 October 2017.
- Using the Robot Operating System for Biomimetic Research (pp 515-521)
- Route Following Without Scanning (pp 199-210)
- Sky segmentation with ultraviolet images can be used for navigation. Robotics: Science and Systems X, 12 July 2014 - 16 July 2014.
- How Active Vision Facilitates Familiarity-Based Homing (pp 427-430)
- Feasibility Study of In-Field Phenotypic Trait Extraction for Robotic Soft-Fruit Operations. UKRAS20 Conference: “Robots into the real world” Proceedings
- Towards Insect Inspired Visual Sensors for Robots. UKRAS20 Conference: “Robots into the real world” Proceedings
Preprints
- I2Bot: an open-source tool for multi-modal and embodied simulation of insect navigation, Cold Spring Harbor Laboratory.
- CompoundRay: An open-source tool for high-speed and high-fidelity rendering of compound eyes, Cold Spring Harbor Laboratory.
- From skylight input to behavioural output: a computational model of the insect polarised light compass.
- How the insect central complex could coordinate multimodal navigation, Cold Spring Harbor Laboratory.
- Spatio-temporal Memory for Navigation in a Mushroom Body Model, Cold Spring Harbor Laboratory.
- A Decentralised Neural Model Explaining Optimal Integration of Navigational Strategies in Insects.
- Place recognition with event-based cameras and a neural implementation of SeqSLAM.
- Grants
Current research grants
- ActiveAI - active learning and selective attention for robust, transparent and efficient AI, EPSRC, 11/2019 - 10/2024, £953,584, as Co-PI
Previous research grants
- SkyEye: Feasibility Study of Vision Based Localisation in GPS-compromised Environments, EPSRC, 10/2020 - 03/2022, £24,993, as PI
- Brains on Board: Neuromorphic Control of Flying Robots, EPSRC, 12/2016 - 06/2022, £2,128,934, as Co-PI
- Exploiting invisible cues for robot navigation in complex natural environments, EPSRC, 02/2015 - 08/2018, £558,417, as Researcher Co-PI
- Professional activities and memberships
Member of the Machine Learning research group and Sheffield Robotics