Extended Reality for Robot-Supported Assembly in Human-Centered Industry 5.0
XR4Human-SERVE 5.0 aims to design a human-centred, neuroergonomic assembly workstation using XR and robotics, and to validate it in a relevant industrial environment. Assembling different products in low batches is a typical SME challenge: it is costly and demands a high level of worker attention and engagement, making the manufacturing laborious.
The overarching objective of this project is therefore to build on previous efforts and results to increase the productivity of such processes while improving workers' satisfaction, by providing illustrative user interfaces (UI), a non-invasive human-robot interaction (HRI) interface, and mental focus assessment (MFA) of the workers through XR technology.
This objective will be achieved using the MASTER-XR framework and a single headset device for the XR-supported robotic setup. To that end, ICEF, as a technology provider with a highly competent robotics and XR team and rich industrial experience, will explore, adapt, and exploit the MASTER-XR framework for GALEB's use case and develop the XR4Human-SERVE 5.0 setup. Thanks to its compactness, portability, and adaptability to other laborious industrial applications, the targeted setup can also deliver societal impact, through well-being at the workplace, and economic impact, through efficiency and productivity.
Technology
The project aims to support Galeb, an SME manufacturing company, in improving its business by leveraging XR-based HRI to increase efficiency and workers' satisfaction at the assembly line. This improvement will come from:
- Enabling faster onboarding of workers to their assigned tasks at the assembly line.
- Enabling an intuitive, frictionless interface with the robot.
These improvements will be implemented through multi-modal interaction, primarily eye gaze, hand gestures, and voice recognition, all enabled by XR HMDs. The assembly-line use case is common to many industries (electronics, automotive, etc.), demonstrating the added value of XR technology and opening up new industrial applications for XR.
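To make the multi-modal idea concrete, the sketch below shows one simple way such inputs could be fused: voice names the action, gaze names the target, and a confirming gesture guards against accidental speech. This is a minimal illustration, not the project's actual implementation; the event types, the "pinch" gesture, and the fusion rule are all assumptions for the example.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical modality events, as an XR HMD runtime might report them.
@dataclass
class GazeEvent:
    target_id: str   # object the worker is currently looking at

@dataclass
class GestureEvent:
    name: str        # e.g. "pinch", "point" (assumed gesture vocabulary)

@dataclass
class VoiceEvent:
    command: str     # e.g. "pick", "confirm"

def fuse(gaze: Optional[GazeEvent],
         gesture: Optional[GestureEvent],
         voice: Optional[VoiceEvent]) -> Optional[str]:
    """Combine modalities into a single robot command string.

    Voice supplies the verb, gaze supplies the target, and a
    confirming pinch gesture disambiguates accidental speech.
    """
    if voice and gaze and gesture and gesture.name == "pinch":
        return f"{voice.command}:{gaze.target_id}"
    return None  # incomplete input: keep waiting for more events

# Example: the worker looks at part A, says "pick", and pinches to confirm.
cmd = fuse(GazeEvent("part_A"), GestureEvent("pinch"), VoiceEvent("pick"))
print(cmd)  # pick:part_A
```

A real system would add timing windows (the three events rarely arrive simultaneously) and confidence scores per modality, but the fusion rule itself can stay this simple.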
Migrating the human-robot interaction modules (UI and MFA) to a single AR HMD goes beyond the current state of the art, enabling a more seamless implementation of the instruction UI, gesture detection, fatigue estimation, and communication with the cobot.
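One practical benefit of co-locating these modules on a single device is that they can share one messaging layer instead of each maintaining its own link to the cobot controller. The sketch below illustrates that pattern with a minimal publish/subscribe bus; the `Bus` class, topic names, and messages are illustrative stand-ins, not the interfaces the MASTER-XR framework actually provides.

```python
from collections import defaultdict

class Bus:
    """Minimal in-process publish/subscribe bus (illustrative only)."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subs[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self._subs[topic]:
            handler(payload)

bus = Bus()
log = []

# With all modules on one HMD, the cobot bridge and the UI each
# subscribe once; gesture detection and MFA just publish events.
bus.subscribe("cobot/cmd", lambda p: log.append(f"cobot <- {p}"))
bus.subscribe("ui/notify", lambda p: log.append(f"ui <- {p}"))

bus.publish("cobot/cmd", "pick part_A")            # from the gesture module
bus.publish("ui/notify", "fatigue: take a break")  # from the MFA module
print(log)  # ['cobot <- pick part_A', 'ui <- fatigue: take a break']
```

In a deployed setup the bus would be replaced by whatever transport the framework and cobot controller expose, but the decoupling between producing modules and consumers is the point of the single-device design.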