Our mission is to increase access to healthcare for rural and underserved communities by providing live 3D communication between patients and doctors. We use Microsoft’s Holoportation™ technology and low-cost Azure Kinect sensors to capture and stream live 3D content of patients to remote clinicians for real-time clinical consultation.
Existing telemedicine solutions that rely on 2D video fail to provide doctors and surgeons with the quantity and quality of information available in an in-person consultation. Our research narrows this gap, demonstrating that 3D telemedicine improves patient satisfaction, realism, and quality compared with standard 2D telemedicine [1].
Our system generates a real-time 3D model of the patient, providing a true-to-life 360-degree view to the clinician that can be used for both patient evaluation and to improve informed discussions with patients. The system is applicable to a wide range of medical scenarios and is currently being used in reconstructive plastic surgery.
This research is an international collaboration between Microsoft, the Canniesburn Regional Plastic Surgery Unit in Scotland, the West of Scotland NHS Innovations Hub, and the National Reconstructive Plastic Surgery and Burns Centre at Korle Bu Teaching Hospital in Ghana. In 2020, the system was implemented as a fast-tracked COVID-19 research project in Scotland to develop new and improved methods of remote consultation during the pandemic. Reconstructive plastic surgeons are using the 3D telemedicine system for pre- and post-operative patient assessment—for example, in patients undergoing reconstruction after burns, cancer, or trauma. The future goal is to bring the system to remote parts of Scotland to facilitate communication between specialists and patients.
Technology
The system consists of 10 capture devices positioned radially around the patient, each containing an Azure Kinect sensor connected to an NVIDIA Jetson Nano computer. The Azure Kinect includes both a standard color camera and a depth camera, which together capture the color of each pixel and its corresponding distance (depth) from the sensor. These data are collected by the NVIDIA Jetson Nano computers and sent to a GPU-powered workstation, where Holoportation™ algorithms fuse the depth maps into a streaming 3D model of the patient. The model is colorized on a secondary rendering desktop and sent to a remote viewer application through which the clinician interacts in real time.
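To give a sense of the geometry involved: before depth maps from multiple sensors can be fused into one model, each per-pixel depth value is back-projected into a 3D point using the camera’s calibration. The sketch below illustrates this step with the standard pinhole camera model; the intrinsics (`fx`, `fy`, `cx`, `cy`) are hypothetical placeholder values, not the Azure Kinect’s actual calibration, and this is not the Holoportation implementation itself.

```python
# Illustrative sketch only: back-project a depth map into camera-space
# 3D points with the pinhole model. Real systems read calibrated
# intrinsics from the sensor SDK; the values below are hypothetical.
import numpy as np

def deproject(depth_map: np.ndarray, fx: float, fy: float,
              cx: float, cy: float) -> np.ndarray:
    """Convert an HxW depth map (meters) into an HxWx3 array of
    camera-space 3D points: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    h, w = depth_map.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth_map
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1)

# Toy 2x2 depth map (meters) with made-up intrinsics
depth = np.array([[1.0, 1.0],
                  [2.0, 2.0]])
points = deproject(depth, fx=500.0, fy=500.0, cx=0.5, cy=0.5)
```

Fusion then transforms each sensor’s point cloud into a shared coordinate frame (using the known radial arrangement of the 10 devices) and merges them into a single surface.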
On the patient’s side, the system includes a monitor and audio system for a Microsoft Teams video call with the clinician. The clinician’s interaction with the 3D model can be screen-shared with the patient for enhanced communication. The remote viewer app includes 3D drawing tools that the surgeon can use directly on the patient’s model to explain details of the surgery, providing a personalized-medicine approach to the patient’s care.
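One property that makes drawing on a 3D model more useful than a 2D overlay is that strokes anchored to 3D coordinates stay attached to the same spot on the patient as the view rotates. The sketch below is a hypothetical illustration of that idea (not the actual viewer code): a stroke point stored in model space is simply re-projected each frame under the current view rotation.

```python
# Hypothetical illustration: a 3D-anchored annotation tracks the model
# under rotation because it is re-projected each frame, unlike a fixed
# 2D screen overlay. Not the actual Holoportation viewer implementation.
import numpy as np

def rotation_y(theta: float) -> np.ndarray:
    """3x3 rotation about the vertical axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[  c, 0.0,   s],
                     [0.0, 1.0, 0.0],
                     [ -s, 0.0,   c]])

def project(p: np.ndarray, f: float = 500.0) -> np.ndarray:
    """Pinhole projection of a camera-space point to 2D pixel offsets."""
    return f * p[:2] / p[2]

# A stroke point anchored to a spot on the model, in model space
# (coordinates in meters; values are made up for illustration).
anchor = np.array([0.10, 1.20, 2.00])

# As the clinician rotates the view, the anchor is re-projected each
# frame, so the drawing follows the same spot on the patient model.
for theta in (0.0, 0.3, 0.6):
    view = rotation_y(theta)
    pixel = project(view @ anchor)
```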
Participatory development and validation
Patients were placed at the heart of the development process and were involved in early discussions on functionality, quality, and usability. The 3D telemedicine system subsequently underwent clinical testing against standard 2D telemedicine [1], demonstrating improvements on validated patient metrics of satisfaction, realism (or “presence”), and quality. Safety and clinical concordance of 3D telemedicine with face-to-face consultation were equivalent to or exceeded estimates for 2D telemedicine. A randomized controlled trial is currently underway to provide definitive evidence of patient benefit.