Ultrasound Identifies Hand Gestures, May Lead to Hands-Free Control of Surgical Systems


At the University of Bristol in the U.K., researchers have used ultrasound to detect the hand gestures a person is making. While the technology has many consumer applications, such as playing video games and controlling devices around the house, this new “physiologic” approach may end up letting surgeons browse hands-free through radiological images during procedures. That would help maintain sterility without requiring another clinician to operate the imaging device’s interface. The same approach may also eventually make in-clinic touchscreens, which currently help to spread infections, unnecessary. We can also envision this technology being used in the rehabilitation of patients recovering from a stroke or musculoskeletal disease.

The researchers used a conventional ultrasound probe to image the muscles of the forearm while different gestures were performed. They then applied computer vision and machine learning tools to correlate the muscle movements with the hand gestures that produce them. Working backwards, the system was able to identify which gesture was being performed from the muscle motion it detected. Notably, the technique worked even when the muscle motion was detected at the wrist, where a future smartwatch containing an ultrasound transducer might sit.
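To give a flavor of how such a system could work, here is a minimal sketch of classifying gestures from motion in an image sequence. Everything in it is an assumption for illustration only: the gesture labels, the feature (mean frame-to-frame intensity change over a coarse grid), the nearest-centroid classifier, and the synthetic "ultrasound" clips are stand-ins, not the Bristol team's actual EchoFlex pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
GESTURES = ["fist", "open_hand", "pinch"]  # hypothetical gesture labels


def motion_features(frames, grid=4):
    """Summarize muscle motion in a clip as a coarse grid of mean
    absolute frame-to-frame intensity changes (a crude motion map)."""
    diffs = np.abs(np.diff(frames, axis=0)).mean(axis=0)  # per-pixel motion
    h, w = diffs.shape
    cells = diffs[: h - h % grid, : w - w % grid]
    cells = cells.reshape(grid, h // grid, grid, w // grid).mean(axis=(1, 3))
    return cells.ravel()


def synthetic_clip(gesture, n_frames=8, size=16):
    """Generate a fake clip with motion concentrated in a gesture-specific
    image region -- a stand-in for real forearm ultrasound recordings."""
    frames = rng.normal(0.0, 0.05, (n_frames, size, size))
    col = GESTURES.index(gesture) * 5
    frames[:, :, col : col + 5] += rng.normal(0.0, 1.0, (n_frames, size, 5))
    return frames


# "Training": average the motion features of many clips per gesture.
centroids = {
    g: np.mean([motion_features(synthetic_clip(g)) for _ in range(20)], axis=0)
    for g in GESTURES
}


def classify(frames):
    """Assign a new clip to the gesture with the nearest motion centroid."""
    f = motion_features(frames)
    return min(centroids, key=lambda g: np.linalg.norm(f - centroids[g]))


print(classify(synthetic_clip("pinch")))  # → pinch
```

A real system would replace the synthetic clips with ultrasound frames and the nearest-centroid step with a trained classifier, but the overall shape, extracting motion features from the image sequence and mapping them to gesture labels, is the same.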

Take a look at this demo video showing off the technology:

Flashbacks:
- Gestureplex Wrist Controller for Hands Free Operation of Devices in Surgical Theater
- Microsoft’s Kinect Technology Utilized for Vascular Surgery
- Robotic Assistant Offers a Helping Hand in the OR
- New System for Hands-Free Control of Image Viewer During Surgery
- Controlling Augmented Reality in the Operating Room
- Real-Time Touch-Free Gesture Control System for Image Browsing in the OR
- Low Cost Glove Translates Sign Language, May Be Used to Practice Surgery in Virtual Reality

Paper: EchoFlex: Hand Gesture Recognition using Ultrasound Imaging, in Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems

Via: University of Bristol

Source: Medgadget

