Engineering news
A team at the Fraunhofer Institute for Biomedical Engineering (IBMT) in Sulzbach, Germany, is working as part of an EU research project to improve control of prosthetic hands, down to individual fingers.
Conventional myoelectric prostheses typically use electrodes placed on the skin, which pick up electrical signals from muscle contractions and pass them to an electronics module that controls the prosthesis.
Through the Soma project, the researchers said they have shown that control of prosthetic hands can be improved using ultrasonic sensors, offering “far greater” accuracy and sensitivity in commands.
The new technique uses ultrasonic sensors that continuously send sound pulses into the muscle tissue of the forearm. Unlike electrical impulses, sound waves are reflected by tissue, and the time an echo takes to return indicates the depth of the muscle strand reflecting it. This allows contractions in the muscle tissue to be studied in great detail, the researchers said, so that ‘activation patterns’ for specific movements can be identified.
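As a rough illustration of that time-of-flight principle (not the project’s actual processing), the depth of a reflector can be estimated by assuming a typical speed of sound in soft tissue of about 1,540 m/s and halving the round-trip path:

```python
# Illustrative time-of-flight depth estimate (not the Soma project's code).
# Assumes a typical speed of sound in soft tissue of ~1,540 m/s.

SPEED_OF_SOUND_TISSUE = 1540.0  # m/s, approximate value for soft tissue


def echo_depth(round_trip_time_s: float) -> float:
    """Approximate depth (in metres) of the tissue boundary that produced
    an echo, given the pulse's round-trip time in seconds."""
    # The pulse travels to the reflector and back, so halve the path length.
    return SPEED_OF_SOUND_TISSUE * round_trip_time_s / 2.0


# Example: an echo arriving 26 microseconds after the pulse was sent
# corresponds to a reflector roughly 20 mm below the transducer.
print(f"{echo_depth(26e-6) * 1000:.1f} mm")
```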
The aim of the project is for AI-controlled software to handle that identification, via an electronic box worn on the user’s body. The electronics could then send decoded signals as commands to the actuators in the prosthetic hand, triggering movement of the prosthetic fingers, all in real time.
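The overall loop is sense, decode, actuate. The sketch below is purely hypothetical – the function names and interfaces are illustrative placeholders, not the Soma project’s software – but it shows the shape of such a real-time pipeline:

```python
# Hypothetical sketch of the sense-decode-actuate loop described above.
# All functions are placeholders; none reflect the Soma project's actual API.
import time


def read_ultrasound_frame():
    """Placeholder: return one frame of echo data from the sensor bracelet."""
    return [0.0] * 20  # e.g. one reading per sensor


def classify_activation_pattern(frame):
    """Placeholder: AI model mapping an echo frame to a finger movement."""
    return "index_flex"  # e.g. 'index_flex', 'thumb_rotate', 'rest'


def send_actuator_command(movement):
    """Placeholder: forward the decoded movement to the prosthesis actuators."""
    print(f"command -> {movement}")


while True:
    frame = read_ultrasound_frame()                 # 1. acquire echoes from the forearm
    movement = classify_activation_pattern(frame)   # 2. decode the activation pattern
    send_actuator_command(movement)                 # 3. drive the prosthetic fingers
    time.sleep(0.01)                                # loop fast enough to feel real-time
```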
“The ultrasonic-based control acts with greater sensitivity and accuracy than would be possible with electrodes. The sensors are able to detect varying degrees of freedom such as flexing, extending or rotating,” said Dr Marc Fournelle, head of the Sensors & Actuators group at Fraunhofer IBMT.
Piezoelectric sound transducers used in the project send pulses at frequencies of 1-4MHz. At least 20 sensors are used, each providing data about the position of muscle strands.
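For a sense of scale (an illustrative calculation, not a figure from the project), at a typical soft-tissue sound speed of about 1,540 m/s those frequencies correspond to wavelengths of roughly 0.4-1.5mm, which sets the order of magnitude of detail the pulses can resolve:

```python
# Illustrative wavelength calculation for the quoted 1-4MHz range,
# assuming ~1,540 m/s sound speed in soft tissue (not project data).
SPEED_OF_SOUND_TISSUE = 1540.0  # m/s

for freq_mhz in (1, 2, 4):
    wavelength_mm = SPEED_OF_SOUND_TISSUE / (freq_mhz * 1e6) * 1000
    print(f"{freq_mhz} MHz -> wavelength ~{wavelength_mm:.2f} mm")
```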
The sensors are currently integrated into a bracelet, which might be fitted into the shaft of the prosthetic hand at a later stage. Users have to complete a short training session, reportedly taking just a few minutes, to link the muscle signals with the correct fingers and desired movements.
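That short training session is, in effect, a calibration step: the system records echo patterns while the user attempts each movement, then fits a model that maps those patterns to finger commands. A minimal sketch of the idea, using a nearest-template rule on made-up data (not the project’s actual training procedure or model):

```python
# Minimal calibration sketch: learn a per-movement 'template' of sensor
# readings, then label new readings by nearest template. Purely illustrative;
# not the Soma project's actual training procedure or model.
import numpy as np


def fit_templates(samples: dict[str, np.ndarray]) -> dict[str, np.ndarray]:
    """samples maps a movement name to an (n_trials, n_sensors) array
    recorded while the user repeats that movement."""
    return {movement: trials.mean(axis=0) for movement, trials in samples.items()}


def predict(templates: dict[str, np.ndarray], frame: np.ndarray) -> str:
    """Return the movement whose template is closest to the new sensor frame."""
    return min(templates, key=lambda m: np.linalg.norm(frame - templates[m]))


# Toy calibration data: 3 repetitions of 2 movements, 20 sensor values each.
rng = np.random.default_rng(0)
calibration = {
    "index_flex": rng.normal(1.0, 0.1, size=(3, 20)),
    "thumb_rotate": rng.normal(-1.0, 0.1, size=(3, 20)),
}
templates = fit_templates(calibration)

new_frame = rng.normal(1.0, 0.1, size=20)  # resembles an index flex
print(predict(templates, new_frame))       # -> 'index_flex'
```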
Imperial College London developed the AI process for recognising movement patterns, and carried out initial testing on subjects.
In the next stage of the project, the researchers want to improve the temporal resolution of the sensors and make the electronics smaller, so that the prosthesis can be controlled even more accurately and comfortably. They also want to make the system bidirectional, with the brain receiving sensory stimuli from the prosthesis. The feedback could be delivered via electrodes implanted in, or onto, nerves.
“When someone who hasn’t lost their hand picks up a glass of water and holds it to their mouth, they get constant feedback from their fingers on how tight to hold the glass, so that on the one hand it doesn’t slip and fall, and on the other so it doesn’t shatter from being squeezed too tightly. Such functionality is also being investigated within Soma, and could one day be integrated into prosthetic hands,” said Andreas Schneider-Ickert, project manager in the Active Implants unit and innovation manager at Fraunhofer IBMT.
Content published by Professional Engineering does not necessarily represent the views of the Institution of Mechanical Engineers.