Electrical engineering innovations have had a profound effect on many fields, and prosthetic technology is a prime example. While prosthetics and exoskeletons are primarily designed for people with missing limbs or other immobilizing conditions, their applications extend beyond medical uses to AR/VR-based learning, robotics, and industrial settings.
A high-level diagram showing the sensor positions researchers use to give context to motion data. Image used courtesy of Nature Electronics and UC Berkeley
Various universities have been investigating prosthetics, working to improve designs in terms of sensing, movement, and the sensation of touch.
Gaining Physiological Data with MXene E-Skin Sensors
A primary objective of prosthetics is to help those who have partial to no ability to use their limbs. This objective has led most prosthetic research to focus on artificial hands and legs. To design a device that mimics a real biological structure as closely as possible, researchers from across fields, including materials science, manufacturing, electronics, and biomedical science, must merge their expertise.
A recent publication from King Abdullah University of Science and Technology (KAUST) on electronic skin (or “e-skin”) sensors described a new material based on MXene-hydrogel heterostructures. E-skins are typically created by layering an active nanomaterial on a surface attached to human skin or a prosthetic arm. KAUST researchers have developed a thin, stretchable material, more durable than comparable designs, that can more accurately function like human skin.
E-skin created from MXene-hydrogel heterostructures. Image used courtesy of KAUST
The newly developed e-skin was created by layering a vinyl silica nanoparticle–polyacrylamide (VSNP-PAM) hydrogel as the elastic substrate, 2D MXene nanosheets as the sensing arrays, and 1D polypyrrole nanowires (PpyNWs). The hydrogen bonding within the compound makes it tougher (~7,020 J/m²) and gives it lower hysteresis (<0.1) than existing hydrogels.
KAUST claims that its prototype e-skin can sense objects from 20 centimeters away, respond to stimuli in less than one-tenth of a second, and when used as a pressure sensor, distinguish handwriting written upon it.
Skin-attachable MXene-PpyNW-VSNP-PAM–based e-skin placed on the forearm. Image used courtesy of KAUST
This sensor continued to work well after 5,000 deformations, recovering in about a quarter of a second each time. The researchers posit that it could transmit a range of physiological data to help clinicians develop treatment plans and create training programs that help prevent or mitigate injuries.
Adding the Sensation of Touch
One issue with designing prosthetics (both in human and robotic use) is simulating the sensation of touch. Late last year, researchers at Cornell University experimented with fiber-optic sensors that combine low-cost LEDs and dyes, which resulted in a stretchable “skin” that detects deformations such as pressure, bending, and strain.
A 3D-printed glove lined with stretchable fiber-optic sensors uses light to detect a range of deformations in real-time. Image used courtesy of Cornell University
Using a mathematical model, the Cornell team could distinguish between different types of deformation and pinpoint their exact locations and magnitudes. The researchers claim this device could enable immersive AR/VR technology, such as augmented reality simulations that teach users new skills through guided movement.
For example, users can learn tasks like changing a tire with the glove simulating the feeling of tightening nuts and bolts. This technology may eventually be implemented on prosthetics to give the user the sensation of touch and provide the wearer better control over dexterity and motion.
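Cornell's actual decoding relies on a mathematical model over its dual-core stretchable lightguides, which the article does not detail. As a toy illustration of the general idea, suppose each dyed region of the glove absorbs one color of light, so the color channel with the largest attenuation reveals where a bend occurred and the size of the drop indicates its magnitude. The colors, regions, and threshold below are hypothetical.

```python
# Transmitted intensity per dye color with the lightguide at rest (normalized)
baseline = {"red": 1.00, "green": 1.00, "blue": 1.00}

# Hypothetical mapping from dye color to the glove region that dye covers
region_of = {"red": "fingertip", "green": "middle joint", "blue": "knuckle"}

def locate_deformations(measured, threshold=0.05):
    """Return (region, magnitude) for each color attenuated beyond the threshold."""
    events = []
    for color, i0 in baseline.items():
        drop = (i0 - measured[color]) / i0  # fractional loss of transmitted light
        if drop > threshold:
            events.append((region_of[color], round(drop, 2)))
    return events

# A bend at the middle joint absorbs mostly green light:
print(locate_deformations({"red": 0.98, "green": 0.62, "blue": 0.97}))
# → [('middle joint', 0.38)]
```

Because each region's dye has a distinct absorption spectrum, a single light source and detector can localize multiple simultaneous deformations, which is what makes the approach inexpensive.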
Prosthetics with AI
Through artificial intelligence, it is possible to emulate the complex functions of the hand. Researchers from UC Berkeley have pursued this complicated simulation by developing a wearable biosensor with artificial intelligence software. This software is said to recognize what hand gesture a person intends to make based on electrical signal patterns in the forearm.
Wearable sensors with integrated AI for prosthetic control. Image used courtesy of UC Berkeley
The UC Berkeley researchers state that their hand gesture recognition system can classify up to 21 different hand signals. Because it is based on a hyperdimensional computing algorithm, the system can update itself with new information (the more it is used, the more accurate it becomes). It also performs all computation locally on-chip, reducing privacy concerns.
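The article does not publish Berkeley's implementation, but the general hyperdimensional computing approach can be sketched in a few dozen lines: each electrode and amplitude level gets a random high-dimensional bipolar vector, a signal window is encoded by binding channel identities to their levels and bundling the results, and classification picks the most similar class prototype. The channel count, gesture names, and EMG amplitudes below are all invented for illustration.

```python
import random

rng = random.Random(0)
D = 2000        # hypervector dimensionality (toy size; HDC work often uses ~10,000)
CHANNELS = 4    # hypothetical number of forearm EMG electrodes
LEVELS = 8      # amplitude quantization levels

def rand_hv():
    """Random bipolar hypervector; nearly orthogonal to any other random vector."""
    return [rng.choice((-1, 1)) for _ in range(D)]

CH_HV = [rand_hv() for _ in range(CHANNELS)]  # one ID vector per electrode
LV_HV = [rand_hv() for _ in range(LEVELS)]    # one vector per amplitude level

def encode(sample):
    """Encode one window of per-channel amplitudes (0..1) as a hypervector."""
    bound = []
    for ch, amp in enumerate(sample):
        lvl = min(int(amp * LEVELS), LEVELS - 1)
        # bind channel identity to its amplitude level (elementwise multiply)
        bound.append([a * b for a, b in zip(CH_HV[ch], LV_HV[lvl])])
    # bundle across channels (elementwise majority vote)
    return [1 if sum(col) > 0 else -1 for col in zip(*bound)]

def similarity(a, b):
    return sum(x * y for x, y in zip(a, b)) / D

# Hypothetical gestures, each defined by a mean amplitude per electrode
signatures = {
    "fist":  [0.9, 0.8, 0.2, 0.1],
    "open":  [0.1, 0.2, 0.8, 0.9],
    "pinch": [0.5, 0.1, 0.9, 0.3],
}

def noisy(sig):
    return [min(max(a + rng.uniform(-0.1, 0.1), 0.0), 1.0) for a in sig]

# "Training": bundle a handful of encoded samples into one prototype per gesture.
# New samples can be bundled in later, which is how such a system keeps improving.
prototypes = {}
for name, sig in signatures.items():
    samples = [encode(noisy(sig)) for _ in range(20)]
    prototypes[name] = [1 if sum(col) > 0 else -1 for col in zip(*samples)]

def classify(sample):
    hv = encode(sample)
    return max(prototypes, key=lambda n: similarity(prototypes[n], hv))

print(classify(noisy(signatures["fist"])))  # prints "fist" with this seed
```

The appeal for on-chip use is that training and updating are just elementwise additions over bipolar vectors, with no gradient computation, so the model can adapt to a user's changing signals directly on a low-power device.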
Could Comfort Lead to Mainstream Adoption?
One commonly overlooked design consideration for prosthetics is user comfort. Some prosthetics research, though state-of-the-art, hasn't matured into mainstream adoption, partly because of cost and partly because of inherent design challenges related to the weight, shape, size, and comfort requirements of different users.
CYBERLEGs Plus Plus prosthetic leg (left) and DeTOP’s prosthetic hand (right). Images used courtesy of CYBERLEGs Plus Plus and DeTOP
CYBERLEGs Plus Plus is developing robotic exoskeletons (a robotic leg and brace) that use sensors connected to two motors to anticipate movement. These exoskeletons may allow amputees to walk and climb stairs with reduced effort and help prevent falls. The prosthetic also includes pressure-sensitive insoles and does not alter the wearer’s gait, improving comfort.
DeTOP’s research addresses the recovery of hand function after amputation. The group has successfully demonstrated a new implant system in a patient who can now dexterously control her hand prosthesis. DeTOP also claims that its prosthetic is more comfortable than basic socket versions thanks to its osseointegration process.
Catch Up on Other Electronic Innovations in Prosthetics
Prosthetic sensors are a hotbed of research. Read up on other research developments in this field below.