Deep Learning Enables Intuitive Prosthetic Control

Prosthetic limbs have been slow to evolve from simple motionless replicas of human body parts to moving, active devices. A major reason is that controlling the many joints of a prosthetic is no easy task. However, researchers have worked to simplify the problem by capturing nerve signals and letting deep learning routines figure out the rest.

The prosthetic arm under test actually carries an NVIDIA Jetson Nano onboard to run the AI nerve signal decoder algorithm.

Reported in a preprint paper, researchers used implanted electrodes to capture signals from the median and ulnar nerves in the forearm of [Shawn Findley], who had lost a hand to a machine shop accident 17 years prior. An AI decoder was then trained to decipher signals from the electrodes using an NVIDIA Titan X GPU.

With this done, the decoder model could then be run on a significantly more lightweight system consisting of an NVIDIA Jetson Nano, which is small enough to mount on a prosthetic itself. This allowed [Findley] to control a prosthetic hand by thought, without needing to be attached to any external equipment. The system also allowed for intuitive control of Far Cry 5, which sounds like a fun time as well.

The research is exciting, and yet another step towards full-function prosthetics becoming a reality. The key to the technology is that models can be trained on powerful hardware, but run on much lower-end single-board computers, avoiding the need for prosthetic users to carry around bulky hardware to make the nerve interface work. If it can be combined with a non-invasive nerve interface, expect this technology to explode in use around the world.
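
That train-big, run-small workflow is easy to picture in code. The sketch below is purely illustrative and not taken from the paper: the decoder architecture, channel count, window length, and file name are all assumptions, but it shows the general pattern of training a decoder on a desktop GPU and exporting a frozen model that a Jetson Nano can load on its own.

```python
# Purely illustrative sketch of the train-on-GPU, run-on-Jetson workflow.
# The decoder architecture, channel count, and window length are made up;
# the actual model in the paper will differ.
import torch
import torch.nn as nn

class NerveDecoder(nn.Module):
    """Tiny 1D CNN mapping a window of nerve-signal samples to gesture classes."""
    def __init__(self, channels=16, classes=6):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(channels, 32, kernel_size=5), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(32, classes),
        )

    def forward(self, x):
        return self.net(x)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = NerveDecoder().to(device)   # trained on the big desktop GPU (Titan X class)
# ... supervised training loop over recorded nerve-signal windows goes here ...

# Freeze the trained decoder into a standalone TorchScript file; the Jetson Nano
# only needs this file and the PyTorch runtime, not the training code or dataset.
torch.jit.script(model.cpu().eval()).save("nerve_decoder.pt")

# On the Jetson: load the frozen decoder and run it on live electrode windows.
decoder = torch.jit.load("nerve_decoder.pt")
window = torch.randn(1, 16, 200)    # stand-in for a real 16-channel signal window
gesture = decoder(window).argmax(dim=1)
```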

[Thanks to Brian Caulfield for the tip!]

How A Quadriplegic Patient Was Able To Eat By Himself Again Using Artificial Limbs

Thirty years ago, [Robert “Buz” Chmielewski] suffered a surfing accident as a teenager. This left him a quadriplegic due to a C6 spinal cord injury. After becoming a participant in a brain-computer interface study at Johns Hopkins, he was recently able to feed himself through the use of prosthetic arms. The most remarkable thing about these prosthetic arms is the neural link with [Buz]’s brain, which allows him not only to control the artificial arms, but also to feel what they are touching, thanks to a closed-loop system that transfers the limbs’ sensory input back to his brain.

The prosthetic limb in question is the Modular Prosthetic Limb (MPL) from Johns Hopkins Applied Physics Laboratory (APL). The Johns Hopkins Medicine Brain-Computer Interface study began a year ago, when [Buz] had six microelectrode arrays (MEA) implanted into his brain: half in the motor cortex and half in the sensory cortex. During the following months, the study focused on using the signals from the first set of arrays to control actuators, such as the MPL. The second set of arrays was used to study how the sensory cortex had to be stimulated to allow a patient to feel the artificial limb much as one feels a biological limb.

What makes this study so interesting is not only the closed-loop approach, which provides the patient with feedback on the position and pressure of the prosthetic, but also that it involves both hemispheres of the brain. As a result, after only a year of the study, [Buz] was able to use two of the MPLs simultaneously to feed himself, which is a delicate and complicated task.

In the video embedded after the break, one can see a comparison of [Buz] at the beginning of the study and today, as he manages to handle cutlery and eat cake without assistance.

Continue reading “How A Quadriplegic Patient Was Able To Eat By Himself Again Using Artificial Limbs”

3D Printed Prosthesis Reads Your Mind, Sees With Its Hand

Hobbyist electronics and robotics are getting cheaper and easier to build as time moves on, and one advantage of that is the possibility of affordable prosthetics. A great example is this transhumeral prosthesis from [Duy], his entry for this year’s Hackaday Prize.

Side views of the 3D printed prosthetic arm.

With ten degrees of freedom, including individual fingers, two axes for the thumb, and enough wrist movement for the hand to wave, this is already a pretty impressive robotics build in and of itself. The features don’t stop there, however. The entire prosthesis is modular and can be used in different configurations, and it’s all 3D printed for ease of customization and manufacturing. Along with the myoelectric sensor that typically controls such prostheses, [Duy] also designed the hand to be controlled with computer vision and brain-computer interfaces.

The palm of the hand has a camera embedded in it, and by passing that feed through CV software, the hand can recognize and track objects the user moves it close to. This makes it easier to grab onto them, since the different gripping patterns required for each object can be programmed into the Raspberry Pi controlling the actuators. Because the alpha-wave BCI may not offer fine enough discernment for a full range of movement of each finger, this computer assistance helps the prosthesis feel more natural to the user.
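
To make that idea concrete, here is a minimal sketch of camera-assisted grip selection, assuming a Python setup on the Raspberry Pi; the grip table, object classes, and the placeholder classify() and set_fingers() helpers are invented for illustration and are not [Duy]’s actual code.

```python
# Hypothetical sketch of camera-assisted grip selection on the Raspberry Pi.
# The grip table, object classes, and helper functions are placeholders.
import time

import cv2

# Finger targets (0.0 = open, 1.0 = closed) for each recognized object class.
GRIP_TABLE = {
    "bottle":  [0.8, 0.8, 0.8, 0.8, 0.6],   # wrap grip
    "card":    [0.2, 0.9, 0.9, 0.9, 0.9],   # thumb-and-index pinch
    "default": [0.5, 0.5, 0.5, 0.5, 0.5],
}

def classify(frame):
    """Stand-in for the real object-recognition step (e.g. a small CNN)."""
    return "bottle"

def set_fingers(targets):
    """Stand-in for commanding the hand's actuators (e.g. PWM outputs)."""
    print("finger targets:", targets)

cap = cv2.VideoCapture(0)                # the camera embedded in the palm
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    label = classify(frame)
    set_fingers(GRIP_TABLE.get(label, GRIP_TABLE["default"]))
    time.sleep(0.1)                      # re-evaluate the grip ten times a second
cap.release()
```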

We’ve seen a fair number of creative custom prostheses here, like this one which uses AI to allow the user to play music with it, and this one which gives its user a tattoo machine for an appendage.

Continue reading “3D Printed Prosthesis Reads Your Mind, Sees With Its Hand”

Moving 3D Printed Prosthetic Arms With A Pulse

One of the best uses of 3D printers we’ve seen is custom prosthetics. Even today, custom-built prosthetics cost an arm and a leg, but there’s no reason why they should. Right now, we can scan someone’s arm or leg, import that scan into a 3D-modeling program, and design a custom-fit orthotic that can be spit out on a 3D printer. Now, we’re seeing some interesting methods of turning those 3D-printed parts into the beginnings of a cybernetic design. This is a custom printed robotic hand controlled by a pulse sensor. It’s in its early stages right now, but so far the results are promising, and it’s a great entry in The Hackaday Prize.

This project draws upon a few of the team’s other endeavours. The first is a 3D-printed mini linear actuator, a project that made it into the finals of the Hackaday Prize in the Robotics Module challenge. This tiny linear actuator is actually powered by a tiny hobby servo rigged up for continuous rotation. Add in some 3D printed gears and a well-designed frame, and you have something that’s just as good as fantastically expensive linear actuators at a bargain basement price. This pulse sensor arm also makes use of the team’s TNS 1i, a 3D printed robotic hand built around those tiny linear actuators.

Of course, if you’re going to build a prosthetic robotic arm, you have to have some sort of brain-machine interface. Previously, the team was using Myoware muscle sensors to control the opening and closing of the fingers. This changed, however, when [Giovanni] was trying to get his Samsung Gear S3 to detect his pulse. Apparently, the way wrist movement throws off a smartwatch trying to listen in on your heartbeat makes an acceptable substitute for a muscle sensor.
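
To give a rough sense of how a pulse sensor can stand in for a muscle sensor, here is a minimal sketch, assuming the sensor is read through an MCP3008 ADC on a Raspberry Pi and that a single servo stands in for the whole hand; none of the wiring or threshold values come from the team’s project.

```python
# Hypothetical sketch: treat a pulse sensor as a crude gesture switch. Moving the
# wrist disturbs the reading far more than a heartbeat does, so a large jump away
# from a slow-moving baseline toggles the hand open or closed. The MCP3008 ADC,
# GPIO pin, and threshold are assumptions for illustration only.
from time import sleep

from gpiozero import MCP3008, Servo

sensor = MCP3008(channel=0)   # pulse sensor wired through an ADC
hand = Servo(17)              # one servo standing in for the whole hand
baseline = sensor.value       # start the baseline at the current reading
closed = False

while True:
    reading = sensor.value
    baseline = 0.95 * baseline + 0.05 * reading   # track slow drift, ignore pulses
    if abs(reading - baseline) > 0.2:             # big jump = wrist motion
        closed = not closed
        hand.value = 1 if closed else -1          # close or open the hand
        sleep(1.0)                                # debounce the toggle
    sleep(0.02)
```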

Rapidly Prototyping Prosthetics, Braille, And Wheelchairs

We live in an amazing time where the availability of rapid prototyping tools and expertise to use them has expanded faster than at any other time in human history. We now have an amazing ability to quickly bring together creative solutions — perfect examples of this are the designs for specialized arm prosthetics, Braille printing, and custom wheelchair builds that came together last week.

Earlier this month we published details about the S.T.E.A.M. Fabrikarium program taking place at Maker’s Asylum in Mumbai. The five-day event was designed to match up groups of makers with mentors to build assistive devices which help improve the condition of differently-abled people.

The participants were split into eight teams and they came up with some amazing results at the end of the five-day program.

Hands-On: Prosthetic Designs That Go Beyond

Three teams worked on projects based on Bionico – a myoelectric prosthesis

DIY Prosthetic Socket – a Human Machine Interface: [Mahendra Pitav aka Mahen] lost his left arm during the series of train bomb blasts in Mumbai in 2006, which killed 200 and injured over 700 commuters. He uses a prosthetic arm which is essentially a three-pronged claw, cable-activated using his good arm. While it is useful, the limited functionality restricted him from doing many simple things. The DIY Prosthetic Socket team worked with [Mahen] and [Nico Huchet] from MyHumanKit (who lost his right arm in an accident 16 years back), and fabricated a prosthetic forearm for [Mahen] with a modular, 3D printed accessory socket. Embedded within the arm is a rechargeable power source that provides 5V USB output at the socket end to power whatever devices are plugged in, along with a second port for recharging mobile phones. Also embedded in the arm is an IR reflective sensor that can sense muscle movements and trigger specific functions of add-on circuits, such as servos.
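
As a rough idea of how that IR sensor could trigger an add-on, here is a minimal sketch, assuming the sensor’s digital output is wired to a Raspberry Pi GPIO pin and that a single servo stands in for the add-on circuit; the pins and behaviour are illustrative assumptions, not the team’s actual electronics.

```python
# Hypothetical sketch: use the socket's IR reflective sensor as a muscle-twitch
# trigger for an add-on servo. GPIO pins and the toggle behaviour are assumptions.
from signal import pause

from gpiozero import DigitalInputDevice, Servo

twitch = DigitalInputDevice(4)   # IR sensor output goes high when the muscle bulges
tool = Servo(18)                 # stands in for whatever add-on the socket carries

def on_twitch():
    # Each detected twitch sweeps the add-on servo to its opposite end stop.
    tool.value = -tool.value if tool.value else 1

twitch.when_activated = on_twitch
pause()                          # sit and wait for sensor events
```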

Continue reading “Rapidly Prototyping Prosthetics, Braille, And Wheelchairs”

[Marla]’s New Arm

It is especially rare to see mainstream media coverage that involves a hackspace, so it was a pleasant surprise yesterday when the local TV news in this writer’s area covered a story that not only highlighted a hackspace’s work, but did so in a very positive light.

[Marla Trigwell] is a young girl from Newbury, UK, who was born without a left hand. She had been provided with prosthetics, but they aren’t cheap, and as a growing child she quickly outgrew them. Her parents researched the problem as modern parents do, and found out about recent advances in 3D-printed prosthetics that are lowering the barrier to access for those like [Marla] born without a limb. Last month [Marla] received her new 3D-printed arm, courtesy of the work of [Andrew Lindsay] at Newbury and District Hackspace.

The arm itself is a Team Unlimbited arm version 2.0 Alfie edition, which can be found on Thingiverse with full sizing instructions for adjusting it to the recipient in Customizer. As the video below the break shows, [Marla] appears very pleased with it, and soon masters its ability to grip objects.

This story is a fantastic demonstration of the ability of a hackspace to be a force for good, a true community organisation. We applaud [Andrew], NADHack, and all involved with it for their work, and hope that 3D printed arms will keep [Marla] supplied with comfortable and affordable prosthetics as she grows up.

Continue reading “[Marla]’s New Arm”

Hackaday Prize Entry: Raimi’s Bionic Arm

Sometimes, the most amazing teams make the most wonderful things happen, and yet there is just not enough time to finish all the features before the product ships. This is what happened to Raimi, who came into this world missing a right hand and half of her right forearm. Raimi is now 9 years old, and commercial mechatronic prostheses are still only available to those who can afford them. When Raimi’s father approached [Patrick Joyce] to ask him for help in building an affordable prosthesis, he knew it would matter, and went right to work.

Continue reading “Hackaday Prize Entry: Raimi’s Bionic Arm”