Intelligent Digital Surgeon – Impulse InnoCH

Intelligent Digital Surgeon
This project aims to design and model a 3D intelligent digital surgeon to guide trainees in virtual surgery. The digital surgeon will recognize and analyze the trainee's hand/arm gestures and provide personalized assessment, real-time feedback, instructions, and recommendations. The proposed system will adopt a distributed software architecture that takes advantage of edge-cloud resources and 5G networking to enable ultra-low-latency untethered VR.
The IDS motion-capture data, containing the expert surgeon's know-how gestures, is provided by the University Hospitals of Geneva (HUG) and then visualized by MIRALab to produce a digital representation of these gestures.
A deep learning model will be derived by UNIGE from recordings of skilled surgeons' gestures to support the feedback decision engine of the digital surgeon. A distributed VR system architecture utilizing edge computing and 5G networks will be implemented by UNIGE.
ORamaVR will integrate the deep learning model into their MAGE system. Furthermore, ORamaVR will embed methods for cutting and tearing physics-based deformable surfaces that advance the state of the art.
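To make the recognize-then-advise loop concrete, the sketch below shows the pipeline the gesture model slots into: captured gesture features are mapped to a label, and the decision engine maps labels to trainee feedback. The nearest-centroid "model", the toy 2D features, and all names here are placeholder assumptions standing in for the project's actual deep learning model and data schema.

```python
import math

# Hypothetical per-gesture feature centroids, imagined as learned from
# recordings of skilled surgeons (toy 2D features for illustration).
GESTURE_CENTROIDS = {
    "incision": (0.9, 0.1),
    "suture":   (0.1, 0.9),
}

# Hypothetical feedback messages the decision engine would return.
FEEDBACK = {
    "incision": "Keep the blade angle steady through the cut.",
    "suture":   "Rotate the wrist to follow the needle's curve.",
}

def classify_gesture(features):
    """Return the gesture label whose centroid is closest to `features`."""
    return min(GESTURE_CENTROIDS,
               key=lambda g: math.dist(features, GESTURE_CENTROIDS[g]))

def feedback_for(features):
    """Look up feedback for the recognized gesture."""
    return FEEDBACK[classify_gesture(features)]

print(classify_gesture((0.8, 0.2)))  # incision
print(feedback_for((0.2, 0.8)))
```

In the full system this placeholder classifier would be replaced by the trained deep learning model, with the same label-to-feedback lookup behind it.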


Scientific goals:

UNIGE, together with HUG and with the support of MIRALab sarl, will provide a dedicated intelligent digital surgeon (IDS) to be used in many different applications, such as:

1. An animatable digital surgeon

2. A dataset containing the right gestures for performing incisions and sutures

3. A dataset containing the trainee’s gestures

4. A dataset containing the surgeon’s advice on the trainee’s incorrect gestures

5. A hand-gesture recognition module

6. An analytical tool to compare the gestures performed by the trainee with the reference gestures from the dataset

7. An analytical tool that will advise the trainee and show them how to perform the correct gestures
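One common way to compare a trainee's gesture with a reference gesture, even when the two are performed at different speeds, is dynamic time warping (DTW). The sketch below is a minimal illustration under assumed inputs (lists of 3D wrist positions); the function name and gesture format are illustrative, not the project's actual comparison tool.

```python
import math

def dtw_distance(traj_a, traj_b):
    """Return the DTW alignment cost between two 3D point sequences."""
    n, m = len(traj_a), len(traj_b)
    # cost[i][j] = best cumulative cost aligning traj_a[:i] with traj_b[:j]
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(traj_a[i - 1], traj_b[j - 1])  # Euclidean step cost
            cost[i][j] = d + min(cost[i - 1][j],      # stay on trainee frame
                                 cost[i][j - 1],      # stay on reference frame
                                 cost[i - 1][j - 1])  # advance both frames
    return cost[n][m]

# A trajectory aligned with a time-stretched copy of itself costs nothing,
# which is why DTW tolerates trainees moving slower than the expert.
reference = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 1.0, 0.0)]
stretched = [(0.0, 0.0, 0.0), (0.0, 0.0, 0.0), (1.0, 0.0, 0.0),
             (1.0, 0.0, 0.0), (2.0, 1.0, 0.0)]
print(dtw_distance(reference, stretched))  # 0.0
```

A large DTW cost would flag a deviating gesture, which the feedback tool could then annotate with the matching expert advice.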

Modeling of the animatable digital surgeon

Arm and hand rigging

Preliminary results