In sophisticated manufacturing operations, such as composites layup, workers’ expertise is a valuable asset that cannot always be replaced. When operators are in the loop, ensuring the human centricity of the system is essential for its acceptance and efficient usage. To this end, two key requirements must be met. First, human activities must be monitored and preprocessed to extract “human to system” input information. Second, operators need access to valuable information from the digital twin to comprehend common goals, understand robot actions, and easily access safety- or performance-critical events. This “system to human” information output closes the circle of bilateral communication. This blog post presents the main targets and challenges associated with the design and development of a new, multi-modal human-system interaction platform that takes advantage of extended reality technology.

By relying on human perception data, our module can identify interactions with extended reality buttons and grasping interfaces, and detect improper handling actions or error events during hybrid co-manipulation scenarios. In parallel, digital content augments the physical workstation for intuitive operator support.

The Operator Tracking for Co-manipulation (OTC) module monitors human activities and preprocesses them into handling inputs. Relying on 3D stereo image data, the OTC module can identify interactions with extended reality buttons, track the coordinates of the operator’s palms as active grasping points, detect improper handling actions or error events, and perceive the actual workstation scene for accurate operator support content.
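The blog post does not detail the OTC pipeline internals, but a common way to turn a 2D palm detection plus stereo data into a 3D grasping point is pinhole back-projection from the disparity map. The Python sketch below assumes a rectified stereo pair with known focal lengths, principal point, and baseline; the function name and the pinhole model are our assumptions, not the project’s actual implementation.

```python
import numpy as np

def palm_to_3d(u, v, disparity, fx, fy, cx, cy, baseline_m):
    """Back-project a detected palm pixel (u, v) to camera-frame 3D
    coordinates, using the stereo disparity at that pixel.
    Assumes a rectified stereo pair and a pinhole camera model."""
    z = fx * baseline_m / disparity   # depth from stereo geometry: Z = f*B/d
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])
```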

Within human-robot co-manipulation, operator awareness is of utmost importance. The operator needs to be certain about the precise location and method for positioning the fabric, while concurrently receiving feedback on the quality of the provided handling inputs.

With this in mind, dedicated human-machine interfaces have been designed. These interfaces allow operators to comprehend common goals, follow the robot agents’ actions, and easily access safety- or performance-critical events in an intuitive manner.

Which technologies are utilized, and how do the developed modules affect operator performance?

The Operator Support Module is an application specifically designed to enhance the operator’s experience and performance during fabric manipulation tasks. It comprises two applications. The first is hosted on an Android tablet mounted on the robot manipulator, facing the operator; it provides valuable information and extends the physical workspace by overlaying data and graphics from the digital twin onto the physical workstation. Augmented reality (AR) technology creates an immersive experience for the operator: by calibrating the relative position of the camera and the tablet within the digital twin, the module can accurately project graphics onto the live image feed from the OTC module’s stereo camera, presenting important cues and instructions directly within the operator’s workstation. The second application is hosted on an edge computer and uses the live image feed from a top-view camera to serve the same functionalities, mainly when the manipulator and the operator are not in proximity.
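To make the projection step concrete, here is a minimal Python sketch of how graphics from a digital twin can be drawn onto a live image using OpenCV’s `projectPoints` with a calibrated camera. The intrinsics, extrinsics, and the drawing call are illustrative placeholders, not the module’s actual code.

```python
import cv2
import numpy as np

# Hypothetical calibration results (obtained once, offline): camera
# intrinsics and distortion from a standard calibration, plus the
# world-to-camera pose taken from the digital twin.
K = np.array([[900.0,   0.0, 640.0],
              [  0.0, 900.0, 360.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)                    # assume negligible lens distortion
rvec = np.zeros(3)                    # world-to-camera rotation (Rodrigues)
tvec = np.array([0.0, 0.0, 1.5])      # world-to-camera translation [m]

def project_twin_point(point_world):
    """Project a 3D point (metres, world frame) onto the image plane."""
    pts, _ = cv2.projectPoints(
        np.asarray(point_world, dtype=np.float64).reshape(1, 3),
        rvec, tvec, K, dist)
    return tuple(int(c) for c in pts.ravel())

# Draw a placement marker from the digital twin onto the live frame.
frame = np.zeros((720, 1280, 3), dtype=np.uint8)  # stand-in for camera feed
u, v = project_twin_point([0.25, -0.10, 0.0])
cv2.circle(frame, (u, v), 12, (0, 255, 0), 2)
```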

Extended reality buttons play a significant role in the operator’s interaction with the system. These buttons are displayed as floating circular shapes strategically placed within the workstation. The OTC module monitors the proximity of the operator’s hands to detect interactions with the buttons. In addition to extended reality buttons, the module also offers touch buttons on the tablet’s screen, physical workstation buttons, and voice commands for operating the available interaction function blocks.
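A minimal sketch of how such proximity-based triggering could work, assuming the OTC module reports the palm position and each button’s centre in a common 3D frame; the 5 cm activation radius is an illustrative value. In practice a check like this would run per frame for every visible button.

```python
import numpy as np

TRIGGER_RADIUS_M = 0.05  # illustrative activation radius around a button

def is_hand_on_button(palm_xyz, button_xyz, radius=TRIGGER_RADIUS_M):
    """True when the tracked palm lies inside the button's activation sphere."""
    return np.linalg.norm(np.asarray(palm_xyz) - np.asarray(button_xyz)) < radius
```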

During the pre-grasping phase, the Operator Support Module instructs the operator by highlighting the intended contour of the fabric on the workbench. This guidance helps the operator understand where to unfold the fabric in terms of position and orientation. Additionally, the module displays the arrangement of grasping points for the upcoming co-manipulation process, clarifying the allocation of human and robot grasping areas. The human grasping points also function as extended reality buttons, triggering a grasping action when the operator’s palms are held over the designated grasping points for at least one second.
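Building on the proximity check above, the one-second confirmation could be implemented as a simple dwell timer; this is a hypothetical sketch, not the module’s actual logic.

```python
import time

DWELL_S = 1.0  # hold time required to confirm a grasp (from the text)

class DwellTrigger:
    """Returns True once the palm has remained on a grasping point for DWELL_S."""
    def __init__(self):
        self._since = None  # time at which the palm entered the point

    def update(self, on_point, now=None):
        now = time.monotonic() if now is None else now
        if not on_point:
            self._since = None       # palm left the point: reset the timer
            return False
        if self._since is None:
            self._since = now        # palm just arrived: start timing
        return (now - self._since) >= DWELL_S
```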

During the co-manipulation process, the Operator Support Module provides real-time guidance to the operator regarding the placement location of the manipulated fabric. An augmented floating arrow continuously points to the coordinates where the fabric’s centroid should be placed. If the placement location is within the live image stream field of view, an augmented marker visualizes the exact point. Additionally, a progress bar indicates the quality of the operator’s handling inputs, providing feedback on the effectiveness of the current grasping strategy. This information allows the operator to make real-time adjustments and optimize their approach for improved results.
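Assuming the placement target has already been projected to pixel coordinates (for example with the projection helper above), a simple sketch for choosing between the exact marker and the floating arrow might look like this; the function and its return convention are illustrative.

```python
import numpy as np

def placement_cue(target_px, frame_shape):
    """Return ('marker', point) if the target is inside the live image,
    otherwise ('arrow', angle) pointing from the image centre toward it."""
    h, w = frame_shape[:2]
    x, y = target_px
    if 0 <= x < w and 0 <= y < h:
        return ('marker', (int(x), int(y)))
    # Angle from the image centre toward the off-screen target.
    angle = np.arctan2(y - h / 2, x - w / 2)
    return ('arrow', float(angle))
```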

The Operator Support Module utilizes a score generated by the model-based planner to evaluate the quality of the operator’s handling inputs. This score is represented by a progress bar that changes color from green to red based on the score’s value. A green progress bar indicates high-quality handling inputs and an efficient handling process, while a red progress bar indicates improper handling commands that pose risks. In such cases, the module pauses the supporting robot manipulators and notifies the operator to prevent potential accidents or errors.
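A sketch of the feedback mapping: the text states only that the bar shifts from green to red with the planner score and that poor inputs pause the supporting robots, so the [0, 1] score range and the pause threshold below are illustrative assumptions.

```python
def score_to_feedback(score, pause_threshold=0.3):
    """Map a planner score in [0, 1] to a progress-bar color and a pause flag.

    The 0.3 pause threshold is an assumption; the source text gives no value.
    """
    score = max(0.0, min(1.0, score))
    # Linear blend from red (low score) to green (high score), as (R, G, B).
    color = (int(255 * (1.0 - score)), int(255 * score), 0)
    pause_robots = score < pause_threshold
    return color, pause_robots
```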

By combining the live image feed from the OTC module, AR graphics, and optimization scores, the Operator Support Module facilitates fabric manipulation tasks. It enhances operators’ understanding of task requirements, improves their decision-making process, and ultimately contributes to higher productivity and quality outcomes.

The development of a new, multi-platform AR human-machine interface for fabric co-manipulation tasks can revolutionize the way operators interact with robotic systems. The Operator Support Module, taking advantage of augmented reality technology, provides operators with valuable information, guidance, and support throughout the fabric manipulation process. By empowering operators to make real-time adjustments and optimize their approach, this interface enhances their performance, accuracy, and overall productivity.


Panagiotis Kotsaris, Research Engineer, LMS, University of Patras

Panagiotis Kotsaris works as a research engineer in the “Robots, Automation and Virtual Reality in Manufacturing” group of the Laboratory for Manufacturing Systems and Automation (LMS). He has extensive experience in Extended Reality application development, web development, and multi-modal Human Machine Interfaces.