- Digital twin technology will transform the manufacturing sector
- Collaboration between humans and robots in the workplace is growing in popularity, but safety concerns remain a critical barrier
- Research led by Sheffield presents a novel digital twinning framework to address safety issues, preventing failures, improving performance and saving costs
- The framework simplifies the design, deployment and testing of robotic systems, whilst supporting safety and security
Industry 4.0 technologies such as digital twins will transform the manufacturing sector, with the market estimated to reach $5.6bn by 2027.
A digital twin is a computerised copy of a physical asset; data flows between the two in real time, enabling information in each to influence and control the other. This enables us to use powerful digital tools to monitor and control the physical system, and model elements of the system that might not otherwise be observable.
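The bidirectional flow described above can be sketched in a few lines of Python. This is a minimal illustration only, and all class and method names here are hypothetical, not taken from the paper: a physical asset reports sensor readings to its twin, and the twin can send commands back.

```python
class PhysicalAsset:
    """Stands in for a real machine reporting sensor readings."""
    def __init__(self):
        self.temperature = 20.0
        self.setpoint = 20.0

    def read_sensors(self):
        return {"temperature": self.temperature}

    def apply_command(self, setpoint):
        self.setpoint = setpoint


class DigitalTwin:
    """Virtual copy that mirrors the asset and can command it back."""
    def __init__(self, asset):
        self.asset = asset
        self.state = {}

    def sync_from_physical(self):
        # Physical -> digital: mirror the latest sensor readings.
        self.state.update(self.asset.read_sensors())

    def control_physical(self):
        # Digital -> physical: a simple rule derived from the model.
        if self.state.get("temperature", 0.0) > 80.0:
            self.asset.apply_command(60.0)  # cool the process down


machine = PhysicalAsset()
twin = DigitalTwin(machine)

machine.temperature = 95.0   # a fault develops on the shop floor
twin.sync_from_physical()    # the twin observes it in (near) real time
twin.control_physical()      # the twin intervenes on the physical asset
print(twin.state["temperature"], machine.setpoint)  # 95.0 60.0
```

In a real deployment the synchronisation would run continuously over an industrial network rather than as explicit method calls, but the two directions of influence are the same.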
Bridging the physical and virtual worlds will provide increased access to the data required to deliver smart industrial manufacturing solutions, as well as a means to deal with safety and security concerns.
A paper published in ‘Frontiers in Robotics and AI’ by the Universities of Sheffield, York and Bremen presents a novel, modular digital twinning framework developed for the investigation of safety within collaborative robotic manufacturing processes.
The research, led by Sheffield, raises exciting possibilities for the use of digital twins in robotic safety assurance, showing how they can be used to design, test, deploy, monitor and control real-world collaborative robotic processes.
Collaborative robots are a type of robot intended to operate alongside humans in shared workspaces, and enable the benefits of both manual and automated ways of working to be employed.
This means that we can remove the physical barriers that segregate humans and robots in typical manufacturing environments; a human can now work alongside the robot, and even interact with it physically. For example, a robot may undertake a hazardous operation, like welding, before handing a completed component directly over to a member of staff for further assembly. Alternatively, the robot could hold a workpiece during the manufacturing process, positioning and rotating it to act as an extra pair of hands.
Until recently, digital twins have typically been operated as closed systems, limited to representing a single system or process. Modular frameworks, as illustrated in the paper, enable the connection of multiple individual digital twins (for example representing different robots, sensors, and machines) into one fully connected digital system. This system can then be operated in multiple forms - offline simulation for initial process testing, online process monitoring and control, and via immersive virtual visualisations for training and remote system operation.
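One way to picture such a modular framework is as individual twins publishing their state onto a shared message bus, so that cross-cutting functions such as safety monitoring can observe all of them at once. The sketch below is purely illustrative; the bus, twin, and topic names are assumptions for this example and do not reflect the paper's actual implementation.

```python
class Bus:
    """A tiny publish/subscribe message bus connecting the twins."""
    def __init__(self):
        self.topics = {}
        self.subscribers = {}

    def publish(self, topic, value):
        self.topics[topic] = value
        for callback in self.subscribers.get(topic, []):
            callback(value)

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)


class ModuleTwin:
    """One digital twin in the modular system (e.g. a robot or sensor)."""
    def __init__(self, name, bus):
        self.name = name
        self.bus = bus

    def report(self, key, value):
        self.bus.publish(f"{self.name}/{key}", value)


bus = Bus()
robot_arm = ModuleTwin("robot_arm", bus)
proximity = ModuleTwin("proximity_sensor", bus)

# A safety monitor subscribes across twins: raise a stop alert if a
# human is detected closer than half a metre to the robot.
alerts = []
bus.subscribe(
    "proximity_sensor/distance_m",
    lambda d: alerts.append("STOP") if d < 0.5 else None,
)

proximity.report("distance_m", 0.3)
print(alerts)  # ['STOP']
```

The same connected system could be fed from a simulator for offline testing or from live sensors for online monitoring, which is the flexibility the modular approach provides.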
Dr Jonathan Aitken, Senior University Teacher in Robotics in Sheffield’s Department of Automatic Control and Systems Engineering, said: “In robotics and manufacturing, the certification of safety-critical systems is regulated by standards. As systems become more complex, this becomes increasingly difficult to do by hand. Digital twins provide a unique opportunity to generate evidence to support a safety case and the verification of a system’s compliance with the given standards.”
The ability to digitally replicate and analyse an entire manufacturing process is vital for identifying potential failure points. It can flag operational failures before they occur, helping to prevent unplanned downtime, cut costs, improve product performance and resolve challenges across the supply chain.
Dr Aitken added: “This opens up exciting avenues for further research. Beyond the work described in the paper, we are using the framework to train safety sensing and decision-making systems, monitor for cyber-security intrusions, visualise safety information, explore human-robot interaction and provide training and confidence in robotic safety assurance techniques.”
The study was funded by the Assuring Autonomy International Programme (AAIP) at the University of York and Lloyd’s Register.