“Pointing gestures, the perception of another’s gaze, intuitively knowing where someone’s attention is – in remote settings, we lose these nonverbal, implicit cues that are very important for carrying out design activities,” said Mose Sakashita, a doctoral student in information science.
Sakashita presented the robot, called ReMotion, at the Association for Computing Machinery CHI Conference on Human Factors in Computing Systems in Hamburg, Germany. “With ReMotion, we show that we can enable rapid, dynamic interactions through the help of a mobile, automated robot.”
The device is nearly six feet tall, with a monitor for a head, wheels and game-engine software. It mirrors the movements of a remote user who wears another device, called NeckFace, which tracks head and body movements.
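ReMotion's software has not been published, but the mirroring behaviour described above can be pictured as a simple per-frame controller that nudges the robot's position and orientation toward the pose reported by the wearable tracker. The sketch below is purely illustrative: the Pose fields, the proportional gain and the mirror_step function are assumptions for the sake of the example, not details of the actual system.

```python
import math
from dataclasses import dataclass

# Illustrative only: assumes the wearable streams a head pose (position
# plus yaw) and that the robot shadows it with a proportional update
# applied once per frame, as a game-engine update loop might do.

@dataclass
class Pose:
    x: float    # metres, forward/back within the shared room layout
    y: float    # metres, left/right
    yaw: float  # radians, head orientation

def mirror_step(remote: Pose, robot: Pose, gain: float = 0.5) -> Pose:
    """Move the robot a fraction of the way toward the remote user's pose."""
    # Take the shortest rotation between the two headings to avoid spinning
    # the long way round when the angle wraps past pi.
    dyaw = math.atan2(math.sin(remote.yaw - robot.yaw),
                      math.cos(remote.yaw - robot.yaw))
    return Pose(
        x=robot.x + gain * (remote.x - robot.x),
        y=robot.y + gain * (remote.y - robot.y),
        yaw=robot.yaw + gain * dyaw,
    )

# Example: the remote user walks toward a whiteboard and turns to face it;
# repeated updates converge the robot's pose on the tracked pose.
robot_pose = Pose(0.0, 0.0, 0.0)
tracked = Pose(1.2, 0.4, math.pi / 2)
for _ in range(10):
    robot_pose = mirror_step(tracked, robot_pose)
print(robot_pose)
```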
Previous telepresence robots have typically required manual control, which distracts users from the task at hand.
In a small study, nearly all participants reported having a better connection with their remote teammates when using ReMotion compared to an existing telerobotic system. Participants also reported significantly higher shared attention among remote collaborators.
In its current form, ReMotion only works with two users in a one-on-one remote environment, and each user must occupy a physical space of identical size and layout. In future work, the ReMotion developers intend to explore asymmetrical scenarios, such as a single remote team member collaborating virtually via ReMotion with multiple teammates in a larger room.
Sakashita says that, with further development, ReMotion could be deployed in virtual collaborative environments as well as in classrooms and other educational settings.