Regression Algorithm for Intuitive Control
As shown in the video above, amputees can control the hand intuitively, in the same way they controlled their hand before amputation.
They can move each individual finger, perform any hand gesture, and even apply different forces with different fingers.
The diagram of the algorithm is shown above.
However, since the algorithm is BrainRobotics' property, I can't show more detail here.
For professors I am applying to work with, a description of this algorithm can be found in my statement of objectives.
Because the connection between the flexor and extensor tendons is missing, amputees can't feel the positions of the fingers on the prosthetic hand.
They have to rely on visual feedback to control the location of each finger, which means they must watch the hand the entire time they move its fingers.
Inspired by muscle memory, I am developing an algorithm that partially solves this problem.
The algorithm assists the amputee in moving the hand by learning the amputee's habits.
Since this algorithm is also BrainRobotics' property, I can't expose more detail.
However, the mechanism behind it is similar to the algorithm I developed in the Habit Learning Wheelchair project, where you can find more information.
The following two sketch block diagrams demonstrate the algorithms I previously developed for the prosthetic hand.
Traditional Machine Learning Approach
I developed the first one in 2016, following approaches from established academic papers.
The algorithm classifies 13 gestures with 80% accuracy.
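As a rough illustration of this kind of pipeline, the sketch below computes the classic Hudgins time-domain features (mean absolute value, waveform length, zero crossings, slope sign changes) from one window of multi-channel sEMG data. This is a minimal sketch of the standard approach from the literature, not the actual BrainRobotics code; the window shape, thresholds, and function name are my own assumptions.

```python
import numpy as np

def hudgins_features(window, zc_thresh=0.01, ssc_thresh=0.01):
    """Classic time-domain sEMG features (Hudgins-style) for one
    analysis window of shape (n_samples, n_channels).

    Returns a 1-D feature vector of length 4 * n_channels:
    [MAV..., WL..., ZC..., SSC...].
    """
    # Mean absolute value: average rectified amplitude per channel.
    mav = np.mean(np.abs(window), axis=0)
    # Waveform length: cumulative signal variation per channel.
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    # Zero crossings, with a noise threshold on the jump size.
    sign_change = np.diff(np.sign(window), axis=0) != 0
    big_jump = np.abs(np.diff(window, axis=0)) > zc_thresh
    zc = np.sum(sign_change & big_jump, axis=0)
    # Slope sign changes, also noise-thresholded.
    d = np.diff(window, axis=0)
    ssc = np.sum((d[:-1] * d[1:] < 0) &
                 (np.maximum(np.abs(d[:-1]), np.abs(d[1:])) > ssc_thresh),
                 axis=0)
    return np.concatenate([mav, wl, zc, ssc])

# Example: one 200-sample window from an assumed 8-channel sEMG armband.
rng = np.random.default_rng(0)
window = rng.standard_normal((200, 8))
features = hudgins_features(window)   # feature vector of length 32
```

In a pipeline like the one described, feature vectors of this kind would then train a multi-class SVM (for example, scikit-learn's `SVC`) over the 13 gesture labels.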
Deep Neural Network Classification
I developed the second one in 2017, replacing the feature-extraction stage with a convolutional neural network and the SVM with a fully connected neural network.
This brought a large improvement in accuracy; however, it is still classification rather than intuitive control.
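To make the architectural change concrete, here is a minimal numpy sketch of the forward pass of such a network: a strided 1-D convolution over the raw sEMG window replaces the hand-crafted features, and a fully connected softmax layer replaces the SVM. All layer sizes, strides, and names are illustrative assumptions, not the network actually deployed.

```python
import numpy as np

def conv1d_relu(x, kernels, stride=4):
    """Strided 1-D convolution with ReLU.
    x: (n_samples, n_channels); kernels: (n_filters, k, n_channels)."""
    n_filters, k, _ = kernels.shape
    steps = (x.shape[0] - k) // stride + 1
    out = np.empty((steps, n_filters))
    for i in range(steps):
        seg = x[i * stride : i * stride + k]          # (k, n_channels)
        # Correlate every filter with this segment at once.
        out[i] = np.tensordot(kernels, seg, axes=([1, 2], [0, 1]))
    return np.maximum(out, 0.0)                        # ReLU

def classify_window(x, kernels, W, b):
    """Conv features -> fully connected layer -> softmax over gestures."""
    h = conv1d_relu(x, kernels).ravel()
    logits = W @ h + b
    e = np.exp(logits - logits.max())                  # stable softmax
    return e / e.sum()

# Example with assumed sizes: 200-sample, 8-channel window;
# 16 conv filters of length 8; 13 gesture classes.
rng = np.random.default_rng(1)
x = rng.standard_normal((200, 8))
kernels = rng.standard_normal((16, 8, 8)) * 0.1
W = rng.standard_normal((13, 49 * 16)) * 0.01          # 49 conv steps
b = np.zeros(13)
probs = classify_window(x, kernels, W, b)              # one probability per gesture
```

In practice the convolution and dense layers would be trained end-to-end with a deep learning framework; this sketch only shows the inference structure that replaced the feature-extraction + SVM pipeline.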