AI-Powered Intuitive Prosthetic Hand




This is a three-year-old project at BrainRobotics. The goal of the project is to develop a bionic prosthetic hand that can be controlled like a real hand.

I am the leader of the engineering team as well as the algorithm and embedded-system developer. I developed the custom neural-network-powered Intuitive Control Algorithm, the Linux kernel module, and the embedded system. I was also involved in the development of the GUI, the circuit, and the mechanical design.
This work won a CES 2019 Innovation Award in the Accessibility category.
The following videos show able-bodied testers and an amputee testing the prosthetic hand.

Prosthetic Hand Tested by Amputee Ni

By the way, I'm the person behind the camera.

Intuitive Control Algorithm Tested by Able-Bodied Testers

Anlu, an Amputee, Plays Piano with World-Famous Pianist Lang Lang

Pictures of the Prosthetic Hand

Neural Network Designed Based on the Structure of the Forearm

Bionic Deep Learning

Regression Algorithm for Intuitive Control

As shown in the video above, the hand can be controlled intuitively by amputees in the same way they controlled their hand before amputation. Amputees can move each individual finger, perform any hand gesture, and even apply different forces with different fingers.

The diagram of the algorithm is shown above. However, since the algorithm is BrainRobotics' property, I can't show more details here. For professors I'm applying to work with, a description of this algorithm can be found in my statement of objective.

Muscle Memory

Because the connection between the flexor and extensor tendons is missing, amputees can't feel the positions of the fingers on the prosthetic hand. They have to rely on visual feedback to control the location of each finger, which means they must look at the hand the whole time they are moving the fingers. Inspired by muscle memory, I'm developing an algorithm that partially solves this problem: it assists the amputee in moving the hand by learning the amputee's habits.

Since the algorithm is BrainRobotics' property, I can't expose more details. However, the mechanism behind it is similar to the algorithm I developed in the Habit Learning Wheelchair project, where you can find more information.

Past Algorithms

The following two sketch block diagrams demonstrate the algorithms I previously developed for the prosthetic hand.

Traditional Machine Learning Approach

I developed the first one in 2016, following the approach in established academic papers. The algorithm can classify 13 gestures with 80% accuracy.

Deep Neural Network Classification

I developed the second one in 2017. I replaced the feature-extraction stage with a convolutional neural network and the SVM with a fully connected neural network, which brought a huge improvement in accuracy. However, it still performs classification rather than intuitive control.

Get EMG Data from the ADS1299 in Real Time

Linux Kernel Module

Why Linux?

Compared to bare metal or an RTOS, a Linux system requires more hardware resources. However, Linux provides the following benefits that none of the other solutions can:

  • Well-supported libraries
    • TensorFlow
    • PyTorch
    • ...
  • Well-supported device drivers
    • Wi-Fi
    • Bluetooth
    • USB
    • ...
  • Well-designed file system
  • Robust task scheduling

Why Linux Kernel?

Developing a Linux kernel module is not an easy task. However, writing the device driver in kernel space provides a lot of benefits.

  • Interrupt support
    • Fast response: The sample rate of the ADS1299 is up to 16 kHz, so each data packet has to be read within a short time window. Using interrupts ensures no packets are lost.
    • Low CPU usage: With interrupts, the system doesn't need to keep polling the Data Ready (DRDY) pin.
  • Direct Memory Access (DMA)
    • Low CPU usage: Reading data from SPI can be handled automatically by DMA instead of the CPU.
  • Virtual file system
    • The ADS1299 device can be configured by writing configuration data to a file from user space, and its data can likewise be collected by reading a file from user space.

Kernel Module Implementation

The image above shows the structure of the kernel module. I used the Linux IIO (Industrial I/O) subsystem as the framework. I'm currently negotiating with BrainRobotics to submit my ADS1299 kernel module to the Linux kernel tree and make it open source. Until then, I can't show the source code here.

Device Tree

Having the Linux kernel module alone is not enough. To let the Linux system know on which bus and port to find the device, a Device Tree entry has to be written. The following image shows how the Device Tree is loaded.

Device Tree Code Example Registering ADS1299's SPI Port
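Since I can't publish the driver itself yet, here is only an illustrative sketch of what such a Device Tree node can look like; the compatible string, bus label, GPIO number, and clock rate are placeholders, not the production values:

```dts
&spi0 {
    status = "okay";

    ads1299@0 {
        /* placeholder compatible string; the real binding may differ */
        compatible = "ti,ads1299";
        reg = <0>;                      /* chip select 0 on this SPI bus */
        spi-max-frequency = <4000000>;  /* example SCLK rate */

        /* DRDY line wired to a GPIO, used as the data-ready interrupt */
        interrupt-parent = <&gpio>;
        interrupts = <25 2>;            /* 2 = falling-edge trigger */
    };
};
```

When the driver probes, it matches on the compatible string, claims the SPI chip select, and requests the DRDY GPIO as its interrupt line.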

RTOS, PID, Encoder, Current Feedback

Embedded System

The entire embedded system running on the prosthetic hand was also developed by me.

System Architecture


The chip I used is from the STM32F4 family, so I used the STM32Cube code generator to generate the libraries for the circuit board. The chip's I/O configuration and clock configuration are shown below.