{"id":465732,"date":"2018-02-12T14:27:06","date_gmt":"2018-02-12T22:27:06","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-project&p=465732"},"modified":"2020-03-13T17:14:29","modified_gmt":"2020-03-14T00:14:29","slug":"cardiolens","status":"publish","type":"msr-project","link":"https:\/\/www.microsoft.com\/en-us\/research\/project\/cardiolens\/","title":{"rendered":"Cardi\u2665lens"},"content":{"rendered":"
Imagine if you could look at someone and see their heartbeat.<\/p>\n
Surgeons could see whether a transplanted organ or tissue has blood flowing to it. Physical trainers would be able to see whether athletes are in their “zone”. People could increase their awareness of their physiological impact on others.<\/p>\n
Using augmented reality, we built a pair of glasses that does exactly that.<\/p>\n
<\/p>\n
<\/p>\n
Figure 1. Cardiolens is a mixed reality system that enables non-contact physiological measurement of multiple people in real time. The physiological signals are visualized in the real world, allowing the wearer to see magnified blood perfusion and vital signs.<\/p><\/div>\n
We augment the appearance of the subject with the blood flow signal. Augmenting the real world with physiological signals has a key advantage: holograms can be displayed on the objects or people of interest without interfering with other elements.<\/p>\n
Figure 2. The user is given feedback about a detected face via a white box overlaid on the real-world environment, highlighting the ROI being analyzed. The heart rate is displayed next to the box.<\/p><\/div>\n
The heart rate of the subject is displayed at the top of the facial region. A semi-transparent mesh augments the skin in real time. An optional pulse wave plot can be displayed below the facial region. Version 2.0 will allow visualization of arousal and stress from changes in heart rate variability (HRV) parameters.<\/p>\n
Tracking.<\/b><\/p>\n Sensing.<\/b><\/p>\n Visualization.<\/b><\/p>\n We used the Microsoft HoloLens and implemented the system in C#. All processing is performed on the device, and images are captured using the front-facing camera; the device does not need to be tethered to a computer. The results demonstrated good performance for capturing users’ physiological signals despite camera and head motions.<\/p>\n We validated Cardiolens against gold-standard contact sensor measurements. The mean absolute error was 1.62 beats per minute. Figure 3. Examples of pulse waveforms from iPPG measurements and the contact sensor.<\/p><\/div>\n Figure 4. Scatter plots of the heart rate estimates from iPPG measurements and the contact sensor.<\/p><\/div>\n <\/p>\n Presented at ACM SIGGRAPH Emerging Technologies. Selected for the SXSW Innovation Awards. 
Using augmented reality we built a pair […]<\/p>\n","protected":false},"featured_media":465837,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","footnotes":""},"research-area":[13562,13551,13554,13553],"msr-locale":[268875],"msr-impact-theme":[],"msr-pillar":[],"class_list":["post-465732","msr-project","type-msr-project","status-publish","has-post-thumbnail","hentry","msr-research-area-computer-vision","msr-research-area-graphics-and-multimedia","msr-research-area-human-computer-interaction","msr-research-area-medical-health-genomics","msr-locale-en_us","msr-archive-status-active"],"msr_project_start":"","related-publications":[428883,481389,707086],"related-downloads":[],"related-videos":[466320],"related-groups":[144794,379814],"related-events":[],"related-opportunities":[],"related-posts":[],"related-articles":[],"tab-content":[],"slides":[],"related-researchers":[{"type":"guest","display_name":"Christophe Hurter","user_id":465750,"people_section":"Section name 
1","alias":""}],"msr_research_lab":[199565],"msr_impact_theme":[],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/465732","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-project"}],"version-history":[{"count":22,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/465732\/revisions"}],"predecessor-version":[{"id":642798,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/465732\/revisions\/642798"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/465837"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=465732"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=465732"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=465732"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=465732"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=465732"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}\n
Face tracking and skin segmentation algorithms are used to locate the regions of interest (ROIs) in incoming frames from the front-facing camera.<\/ol>\n<\/li>\n<\/ol>\n
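As a rough illustration of the tracking step, the sketch below averages skin-like pixels inside a face bounding box reported by a tracker. The channel-ratio skin rule and the function name are illustrative assumptions, not the segmentation Cardiolens actually uses.

```python
import numpy as np

def mean_skin_value(frame, box):
    """Average the green channel over skin-like pixels inside a face box.

    frame: H x W x 3 uint8 RGB image; box: (x, y, w, h) from a face tracker.
    The crude channel-ratio rule below (red > green > blue, red not too dark)
    is a stand-in for a real skin segmentation model.
    """
    x, y, w, h = box
    roi = frame[y:y + h, x:x + w].astype(float)
    r, g, b = roi[..., 0], roi[..., 1], roi[..., 2]
    mask = (r > g) & (g > b) & (r > 60)
    if not mask.any():
        return roi[..., 1].mean()  # fall back to the whole box
    return g[mask].mean()
```

Averaging one value per frame turns the video into a 1-D trace that the sensing step can analyze.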
\n
Remote imaging photoplethysmography (iPPG) is used to recover the blood volume pulse, heart rate, and heart rate variability of the person you are looking at. Imaging PPG is an advanced set of computer vision methods that enables measurement using just the camera and ambient light – it does not require calibration.<\/ol>\n<\/li>\n<\/ol>\n
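A minimal frequency-domain sketch of the iPPG idea, assuming a per-frame mean green-channel trace as input: detrend, take the FFT, and pick the strongest peak in the physiological band. The real pipeline is considerably more robust to motion and illumination changes than this.

```python
import numpy as np

def estimate_heart_rate(green_trace, fps):
    """Estimate heart rate (BPM) from a per-frame mean green-channel trace.

    Looks for the dominant spectral peak in the 0.7-4 Hz band (42-240 BPM),
    the range where human pulse rates plausibly fall.
    """
    x = np.asarray(green_trace, float)
    x = x - x.mean()                      # remove the DC component
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    power = np.abs(np.fft.rfft(x)) ** 2
    band = (freqs >= 0.7) & (freqs <= 4.0)
    return 60.0 * freqs[band][np.argmax(power[band])]
```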
\n
\n
We augment the appearance of the subject with the blood flow signal by altering the brightness of the skin on their face via a semi-transparent mask. We developed a holographic overlay that shows the pulse signal superimposed onto the face using a linear image processing pipeline.<\/ol>\n<\/li>\n<\/ol>\n<\/li>\n<\/ol>\n
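The brightness modulation behind the overlay can be sketched as below; the skin mask, the gain value, and the function name are illustrative assumptions rather than the values Cardiolens uses.

```python
import numpy as np

def pulse_overlay(frame, mask, pulse_value, gain=0.3):
    """Brighten or darken skin pixels in step with the pulse signal.

    frame: H x W x 3 uint8 image; mask: H x W bool skin mask;
    pulse_value: current normalized pulse amplitude in [-1, 1].
    Scaling brightness by the pulse mimics a semi-transparent mesh whose
    intensity follows the blood volume pulse.
    """
    out = frame.astype(float)
    out[mask] *= 1.0 + gain * pulse_value
    return np.clip(out, 0, 255).astype(np.uint8)
```

Applied per frame with the current pulse amplitude, this makes the magnified blood perfusion visible on the face.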
<\/p>\n
Validation<\/h1>\n
\nUsing a peak detection algorithm, we were able to identify the inter-beat intervals of the heart.<\/p>\nRecognition<\/h1>\n
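The peak detection step described under Validation can be sketched as below; the simple local-maximum rule is a stand-in for the actual detector, and the function name is illustrative.

```python
import numpy as np

def inter_beat_intervals(pulse, fps):
    """Locate systolic peaks in a pulse waveform and return the
    inter-beat intervals in seconds.

    A sample counts as a peak if it is a local maximum above the
    signal mean; successive peak times give the inter-beat intervals
    from which HRV statistics can be computed.
    """
    x = np.asarray(pulse, float)
    peaks = [i for i in range(1, len(x) - 1)
             if x[i] > x[i - 1] and x[i] >= x[i + 1] and x[i] > x.mean()]
    return np.diff(peaks) / fps
```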
\n<\/p>\n
\n<\/p>\n","protected":false},"excerpt":{"rendered":"