An international team from ETH Zurich and New York University created a “user-friendly, stretch-sensing data glove” to capture “real-time, interactive hand poses” with more precision than conventional techniques.
Motion capture generally depends on camera systems such as the Xbox Kinect, or a combination of cameras and handheld controllers. The new device, however, captures movement without cameras or controllers, potentially making it more accessible for a range of applications including augmented and virtual reality (AR and VR), robotics and biomedical engineering.
“There are distinct advantages in not having to set up camera systems,” independent expert David Bisset told Professional Engineering. “There are a lot of people working in areas with low lighting… you can’t put a camera in all the areas you are going to be working.”
AR and VR markets could be “quite big consumers” of the technology if it is easy to wear and remove and if it fits a variety of hand sizes, said Bisset, who is executive director of the European Robotics Association and was not involved in the work. He said the technique could be a cost-effective motion capture process, which would also potentially be useful for teleoperation of robots such as the anthropomorphic hand from London’s Shadow Robot Company. Other uses could be simply monitoring people’s hand movements or training industrial machines, he said.
Hands on
To make the device, the researchers attached silicone fitted with 44 stretch sensors to a glove made of soft, thin fabric. The team reconstructed poses from the sensor readings by using a “data-driven model that exploits the layout of the sensor itself”. The model was trained once using an inexpensive, off-the-shelf poseable hand.
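The paper's actual reconstruction method is not described in detail here; as a rough illustration of the general idea, the sketch below assumes a simple learned regression from 44 stretch-sensor readings to a vector of hand joint angles, calibrated once against known poses (much as the researchers trained their model with a poseable hand). Names such as SENSOR_COUNT and POSE_DIM, and the use of plain ridge regression, are illustrative assumptions rather than the team's implementation.

```python
# Illustrative sketch only: the researchers use a data-driven model that
# exploits the sensor layout; here a plain ridge regression stands in for it.
import numpy as np
from sklearn.linear_model import Ridge

SENSOR_COUNT = 44   # stretch sensors embedded in the silicone layer
POSE_DIM = 20       # hypothetical number of joint angles describing a hand pose

# --- One-off calibration, analogous to posing a cheap poseable hand ---
# Pretend we recorded sensor readings for a set of known calibration poses.
rng = np.random.default_rng(0)
calibration_poses = rng.uniform(-1.0, 1.0, size=(200, POSE_DIM))   # known poses
true_mapping = rng.normal(size=(POSE_DIM, SENSOR_COUNT))           # unknown sensor physics
calibration_readings = calibration_poses @ true_mapping            # simulated sensor values
calibration_readings += rng.normal(scale=0.01, size=calibration_readings.shape)  # noise

# Fit the model once; afterwards no cameras or controllers are needed.
model = Ridge(alpha=1e-3).fit(calibration_readings, calibration_poses)

# --- Real-time use: map a new frame of 44 readings to an estimated pose ---
new_reading = calibration_readings[0] + rng.normal(scale=0.01, size=SENSOR_COUNT)
estimated_pose = model.predict(new_reading.reshape(1, -1))[0]
print(estimated_pose.round(2))
```

In practice the mapping from stretch to pose is non-linear and depends on how the sensors are laid out over the hand, which is why the team's model is trained from data rather than derived analytically.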
The researchers tackled hurdles such as capturing hand motions in a variety of environments and settings while using only user-friendly equipment and an easy-to-learn set-up. They demonstrated the stretch-sensing soft gloves accurately computing hand poses in real time, even while the user was holding a physical object or in low lighting.
“To our best knowledge, our gloves are the first accurate hand-capturing data gloves based solely on stretch sensors,” said Oliver Glauser, lead author of the work and a PhD student at ETH Zurich. “The gloves are soft and thin, making them very comfortable and unobtrusive to wear, even while having 44 embedded sensors. They can be manufactured at a low cost with tools commonly available in fabrication labs.”
Mass manufacture will be key for any widespread use, said Bisset. Haptic feedback would be useful, he added, and more degrees of movement would be needed for operating entire robotic arms, as the captured data only represents a hand changing pose in a stationary position.
The team said they will explore a similar approach for tracking a whole arm, including its global position and orientation. They could even try a whole-body suit, but will likely try different sizes of glove first.
They will demonstrate the work at Siggraph 2019 in Los Angeles, between 28 July and 1 August.
Content published by Professional Engineering does not necessarily represent the views of the Institution of Mechanical Engineers.