A project that will improve the ability of autonomous systems to reason about the impact of their decisions and actions on technical and social requirements and rules has been awarded over £3M from the UKRI Trustworthy Autonomous Systems (TAS) programme.
The TAS Node in Resilience project brings together the disciplines of computer science, engineering, law, mathematics, philosophy and psychology from five UK universities to develop a comprehensive toolbox of principles, methods, and systematic approaches for engineering resilient autonomous systems and systems of systems.
“Despite remarkable technological advances, current autonomous systems have several limitations that prevent us from introducing them safely,” said Dr Radu Calinescu, Reader in the Department of Computer Science at the University of York, and Principal Investigator of the TAS Node in Resilience. “A major constraint is the limited socio-technical resilience of current autonomous systems. We need to build systems that are technically resilient, able to deal with unknown and unexpected situations and adapt their learning in response, and that respect our social, legal, ethical, empathic and cultural norms and rules. This project will work across the disciplines to achieve that.”
The project, which starts on 1 November 2020, will fund a team of 13 investigators and seven post-doctoral researchers across the five universities. Testbeds at each university will validate the foundational research in domains including health and social care, emergency response, and multimodal transportation.
The University of Sheffield brings expertise in analysing and designing safe interactions between systems and humans. Sanja Dogramadzi, Professor of Medical Robotics at Sheffield Robotics, said: “Our work will focus on how we develop systems that are resilient to uncertainty and disruption, both within the system itself and in its environment. We will also look at how the system cooperates with other systems and with humans to combine capabilities and seek assistance for improved resilience. We will develop demonstrators in collaboration with the University of Sheffield's Advanced Manufacturing Research Centre to facilitate faster implementation and adoption of the project results.”
Mark Levine, Professor of Social Psychology at Lancaster University, will draw on his expertise in the role of social identities and the effect of robot-human interactions on intergroup relations to ensure that autonomous systems observe social, legal, ethical, empathic and cultural norms and rules. “For autonomous systems to be socially trustworthy, we must be able to identify the social and cultural rules which a system must adhere to, and be able to specify those as requirements when developing the system,” said Professor Levine.
Professor Bashar Nuseibeh from the Open University said: “To be truly resilient, autonomous systems must be able to adapt and evolve as faults, failures and uncertainties become apparent. Our experience in adaptive systems and requirements engineering will enable us to develop systems that can handle unexpected disruptions.”
“Human factors research studies how people interact with systems, other people, and their environment,” said Professor Neville Stanton of the University of Southampton. “Crucial to developing trustworthy autonomous systems is understanding how systems and humans interact in order to increase socio-technical resilience through cooperation.”
The project has two stages: the first looks at how individual autonomous systems can be developed to be more resilient, and the second looks at the socio-technical resilience of autonomous systems of systems. For example, an autonomous system of systems may support the end-to-end patient journey for a person requiring emergency assistance from a first responder, followed by admission to, care in and discharge from hospital, and long-term care at home.
This project is part of the UKRI Trustworthy Autonomous Systems (TAS) programme, funded through the UKRI Strategic Priorities Fund and delivered by the Engineering and Physical Sciences Research Council (EPSRC). The TAS programme brings together the research communities and key stakeholders to drive forward cross-disciplinary fundamental research to ensure that autonomous systems are safe, reliable, resilient, ethical and trusted.