{"id":780706,"date":"2021-11-12T07:16:06","date_gmt":"2021-11-12T15:16:06","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-group&p=780706"},"modified":"2024-04-26T13:54:29","modified_gmt":"2024-04-26T20:54:29","slug":"biomedical-imaging","status":"publish","type":"msr-group","link":"https:\/\/www.microsoft.com\/en-us\/research\/group\/biomedical-imaging\/","title":{"rendered":"Biomedical Imaging"},"content":{"rendered":"
\n\t
\n\t\t
\n\t\t\t\"medical\t\t<\/div>\n\t\t\n\t\t
\n\t\t\t\n\t\t\t
\n\t\t\t\t\n\t\t\t\t
\n\t\t\t\t\t\n\t\t\t\t\t
\n\t\t\t\t\t\t
\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\n\n

Biomedical Imaging<\/h1>\n\n\n\n

Addressing the challenges of speed, quantification, and cost<\/p>\n\n\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/div>\n<\/section>\n\n\n\n\n\n

Biomedical imaging plays a critical role in health and life sciences, from basic research to diagnostics and treatment. The imaging process is a complex chain involving sample\/subject preparation, excitation, data acquisition, signal processing, data storage, and image interpretation. Technology and software have been the drivers of innovation in every stage of this imaging chain. The stages of the imaging chain are closely connected, and we believe that transformative impact in biomedical imaging is best achieved by considering the entirety of the imaging experiment and implementing changes where they are most impactful. <\/p>\n\n\n\n

Current challenges in biomedical imaging<\/h3>\n\n\n\n
\n
\n
\"speed<\/figure>\n\n\n\n

Speed<\/strong><\/p>\n\n\n\n

Current imaging techniques tend to be slow. This can cause discomfort and added radiation risk for patients under examination, and it reduces throughput and delays image interpretation in both research and clinical applications.<\/p>\n<\/div>\n\n\n\n

\n
\"quantifcation<\/figure>\n\n\n\n

Quantification<\/strong><\/p>\n\n\n\n

The lack of quantitative imaging information complicates interpretation and limits our ability to develop algorithms and tools that automate it.<\/p>\n<\/div>\n\n\n\n

\n
\"cost<\/figure>\n\n\n\n

Cost<\/strong><\/p>\n\n\n\n

Biomedical imaging instruments can be expensive, using specialized hardware and infrastructure. Along with their low throughput, these factors add to the cost of biomedical imaging.<\/p>\n<\/div>\n<\/div>\n\n\n\n

We believe that new technologies and software can be used to alleviate some aspects of these problems. We are exploring how novel signal processing techniques and AI can allow us to speed up biomedical imaging, quantify imaging information, and help make imaging more accessible and affordable.<\/p>\n\n\n\n

Research focus areas<\/h3>\n\n\n\n

At Microsoft, we undertake research projects that can have a transformative impact on biomedical imaging. While each project may focus on a specific aspect of the imaging process, e.g., image interpretation, we consider the entire imaging chain and improve technology at several levels. Our imaging projects are usually done in partnership with academic medical centers and medical device manufacturers.<\/p>\n\n\n\n

\n
\n
\"Connected<\/figure>\n\n\n\n

Connected Imaging Instrument<\/h4>\n\n\n\n

Microsoft\u2019s Connected Imaging Instrument<\/a> project is focused on developing technologies that allow imaging instruments to leverage the cloud for on-demand processing and scalable storage. We believe the pivot to cloud computing for medical image signal processing will enable faster deployment of new techniques, improving the patient experience and expanding access to advanced diagnostics.<\/p>\n<\/div>\n\n\n\n

\n
\"Machine<\/figure>\n\n\n\n

Machine Learning for Image Reconstruction<\/h4>\n\n\n\n

Microsoft is developing machine learning-based techniques that can reconstruct medical images from raw datasets (opens in new tab)<\/span><\/a> that are incomplete (undersampled) or corrupted by artifacts. We believe that machine learning will play a critical role in accelerating medical image acquisition and improving the patient experience, and we are particularly interested in how well machine learning-based reconstruction recovers clinically relevant details and lesions. We have curated annotations for clinically relevant lesions (opens in new tab)<\/span><\/a> and are using these annotations to optimize reconstructions and workflow.<\/p>\n<\/div>\n<\/div>\n\n\n\n
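To illustrate the undersampling problem described above, the following sketch simulates retrospective undersampling of an image's 2D Fourier transform (k-space) and a zero-filled inverse-FFT reconstruction, the classical baseline that learned reconstruction models aim to improve upon. This is an illustrative toy example, not Microsoft's actual pipeline; the phantom, sampling fractions, and function names are invented for the demonstration.

```python
import numpy as np

def undersample_kspace(image, keep_fraction=0.33, center_fraction=0.08, seed=0):
    """Retrospectively undersample the 2D Fourier transform (k-space) of an image.

    Randomly keeps a fraction of phase-encode lines (rows), always retaining
    a small fully sampled band at the center of k-space where most energy lies.
    """
    rng = np.random.default_rng(seed)
    kspace = np.fft.fftshift(np.fft.fft2(image))
    n = image.shape[0]
    mask = rng.random(n) < keep_fraction            # random row selection
    center = int(n * center_fraction / 2)
    mask[n // 2 - center : n // 2 + center] = True  # keep low frequencies
    return kspace * mask[:, None], mask

def zero_filled_recon(undersampled_kspace):
    """Baseline reconstruction: inverse FFT with missing samples left at zero."""
    return np.abs(np.fft.ifft2(np.fft.ifftshift(undersampled_kspace)))

# Toy "phantom": a bright square on a dark background.
image = np.zeros((128, 128))
image[40:88, 40:88] = 1.0

kspace_us, mask = undersample_kspace(image)
recon = zero_filled_recon(kspace_us)
print(f"kept {mask.mean():.0%} of k-space lines, "
      f"mean recon error {np.abs(recon - image).mean():.3f}")
```

A learned reconstruction model would take `recon` (or the masked k-space itself) as input and be trained to recover the fully sampled image, removing the aliasing artifacts that zero-filling leaves behind.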

\n
\n
\"Dr<\/figure>\n\n\n\n

Radiotherapy image segmentation<\/h4>\n\n\n\n

Project InnerEye (opens in new tab)<\/span><\/a> is research conducted in the Health Intelligence team at Microsoft Research Cambridge (UK) (opens in new tab)<\/span><\/a> exploring how machine learning (ML) can assist clinicians in planning radiotherapy (RT) treatments so that they can spend more time with their patients. Our recent peer-reviewed research published in JAMA Network Open (opens in new tab)<\/span><\/a> shows that clinicians using ML assistance can segment images up to 13 times faster than doing so manually, with an accuracy within the bounds of human expert variability. Cambridge University Hospitals NHS Foundation Trust is working with University Hospitals Birmingham NHS Foundation Trust on an NHSX AI Award project (opens in new tab)<\/span><\/a> to deploy their own radiotherapy image segmentation ML models, using Project InnerEye OSS and Microsoft Azure, to help cut the time patients wait for their cancer treatment.<\/p>\n<\/div>\n\n\n\n
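Segmentation accuracy of the kind mentioned above is commonly measured with the Dice coefficient, the standard overlap metric for comparing a model's contour against an expert's. A minimal sketch (illustrative, not InnerEye's actual evaluation code; the toy masks are invented):

```python
import numpy as np

def dice_coefficient(pred, truth, eps=1e-8):
    """Dice overlap between two binary masks: 2|A intersect B| / (|A| + |B|).

    Returns 1.0 for a perfect match and 0.0 for no overlap.
    """
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    intersection = np.logical_and(pred, truth).sum()
    return 2.0 * intersection / (pred.sum() + truth.sum() + eps)

# Two toy organ masks that overlap partially.
truth = np.zeros((64, 64), dtype=bool)
truth[20:40, 20:40] = True   # 400 voxels
pred = np.zeros((64, 64), dtype=bool)
pred[25:45, 20:40] = True    # 400 voxels, 300 of them shared with truth

print(f"Dice = {dice_coefficient(pred, truth):.3f}")  # 2*300/800 = 0.750
```

"Within the bounds of human expert variability" means the model-to-expert Dice score falls inside the range of Dice scores observed between different human experts contouring the same structure.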

\n
\"eyeball<\/figure>\n\n\n\n

Ophthalmology<\/h4>\n\n\n\n

As part of Microsoft\u2019s work with Novartis<\/a>, data scientists from Microsoft Research and research teams from Novartis have worked together to investigate how AI can help unlock transformational new approaches in several areas. The Health Intelligence team at Microsoft Research Cambridge (UK)<\/a> has worked with Novartis to explore how machine learning might be applied for personalized treatment of macular degeneration \u2013 a leading cause of irreversible blindness.<\/p>\n<\/div>\n<\/div>\n\n\n\n

\n
\n
\"Digital<\/figure>\n\n\n\n

Digital Pathology<\/h4>\n\n\n\n

There are key opportunities to improve patient diagnosis and care by augmenting digital pathology with AI. To address the shortage of medical expertise in medical image analysis in developing countries, researchers at MSR Asia have worked with partners to explore computer vision solutions that assist doctors in making diagnoses, including a collaboration with Pfizer<\/a>. <\/p>\n\n\n\n

Our MSR New England Biomedical ML team (opens in new tab)<\/span><\/a> has partnered with Volastra Therapeutics (opens in new tab)<\/span><\/a> to develop tools that help detect drivers of cancer metastasis. Together, we will develop automated machine learning tools capable of rapidly and accurately integrating insights across multiple datasets, including pathology slides and three-dimensional tumor-derived organoids.<\/p>\n<\/div>\n\n\n\n

\n
\"microscopic<\/figure>\n\n\n\n

Pandemic Preparedness<\/h4>\n\n\n\n

The Project InnerEye team is supporting two projects as part of Microsoft\u2019s Studies in Pandemic Preparedness<\/a> research program. University Hospitals Birmingham NHS Foundation Trust<\/a> is using Project InnerEye OSS to develop ML models that support clinicians in diagnosing and assessing the severity of COVID-19 from chest X-rays, using a human-interpretable class hierarchy that aligns with the radiological decision process. The team used data-driven error analysis<\/a> to uncover blind spots of the model, providing further transparency on its clinical utility.<\/p>\n\n\n\n
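A human-interpretable class hierarchy of the kind mentioned above can be sketched as a tree of conditional classifiers, where each leaf's probability is the product of the conditional probabilities along its path. This is an illustrative assumption about how such a hierarchy might be wired, not the project's actual model; the hierarchy levels and logit values below are invented for the example.

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over a 1-D array of logits."""
    e = np.exp(logits - logits.max())
    return e / e.sum()

# Hypothetical two-level label hierarchy for a chest X-ray classifier:
#   level 1: normal vs. abnormal
#   level 2 (given abnormal): COVID-19 pneumonia vs. other pneumonia vs. other finding
logits_level1 = np.array([0.2, 1.5])        # [normal, abnormal]
logits_level2 = np.array([1.0, 0.3, -0.5])  # conditional on "abnormal"

p_level1 = softmax(logits_level1)   # P(normal), P(abnormal)
p_level2 = softmax(logits_level2)   # P(class | abnormal)

# A leaf probability is the product of conditionals along its path,
# so every prediction stays consistent with the decision tree a
# radiologist would follow.
p_covid = p_level1[1] * p_level2[0]
print(f"P(abnormal) = {p_level1[1]:.3f}, P(COVID-19 pneumonia) = {p_covid:.3f}")
```

The appeal of this structure is that intermediate decisions (e.g., "abnormal") are inspectable on their own, which supports the kind of transparency and error analysis the text describes.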

Researchers at University College London (UCL) and UCL Hospitals NHS Foundation Trust (opens in new tab)<\/span><\/a> are also using Project InnerEye OSS and Azure Machine Learning to see whether chest imaging data can better identify who should shield in potential future COVID-19 outbreaks. This project uses the UK National COVID-19 Chest Imaging Database (NCCID), led by Dr Joe Jacob at UCL (opens in new tab)<\/span><\/a>.<\/p>\n<\/div>\n<\/div>\n\n\n\n

Project InnerEye Open-Source Software<\/h2>\n\n\n\n

Project InnerEye open-source software<\/a> (OSS) is created and used for deep learning research as part of Project InnerEye in the Health Intelligence team at Microsoft Research Cambridge (UK)<\/a>. It is released at no cost under an MIT open-source license to make it widely available to the global medical imaging community, who can leverage our work. The tools aim to increase productivity in research and development of best-in-class medical imaging AI and to help enable deployment using Microsoft Azure cloud computing (subject to appropriate regulatory approvals). Support for these OSS tools is via GitHub Issues on the relevant repositories.<\/p>\n\n\n\n

Learn more about Project InnerEye OSS ><\/a><\/p>\n\n\n\n

Biomedical imaging across Microsoft<\/h2>\n\n\n\n

The following biomedical imaging efforts at Microsoft complement the work in Microsoft Research.<\/p>\n\n\n\n

AI for Health (opens in new tab)<\/span><\/a><\/strong> is a $60 million, five-year philanthropic program from Microsoft, created to empower nonprofits, researchers, and organizations tackling some of the toughest challenges in global health. AI for Health supports biomedical imaging efforts across four areas: Quest for Discovery, Global Health Insights, Health Equity, and Research Capabilities. Biomedical imaging collaborations include the Stanford Center for Artificial Intelligence in Medicine and Imaging (opens in new tab)<\/span><\/a>, BC Cancer (opens in new tab)<\/span><\/a>, SRL Diagnostics (opens in new tab)<\/span><\/a>, and the Novartis Foundation (opens in new tab)<\/span><\/a>.<\/p>\n\n\n\n

Microsoft Azure for Healthcare (opens in new tab)<\/span><\/a><\/strong> provides services to deliver better health insights and outcomes as you enhance patient engagement, empower health team collaboration, and improve clinical informatics and operational insights \u2013 with trusted cloud capabilities. Find out more about Azure Healthcare APIs (opens in new tab)<\/span><\/a> for health data standards including Fast Healthcare Interoperability Resources (FHIR\u00ae) and Digital Imaging and Communications in Medicine (DICOM).<\/p>\n\n\n","protected":false},"excerpt":{"rendered":"

We are exploring how novel signal processing techniques and AI can allow us to produce images from less data than is currently required.<\/p>\n","protected":false},"featured_media":783790,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"msr_group_start":"","footnotes":""},"research-area":[13556,13562,13553],"msr-group-type":[243694],"msr-locale":[268875],"msr-impact-theme":[],"class_list":["post-780706","msr-group","type-msr-group","status-publish","has-post-thumbnail","hentry","msr-research-area-artificial-intelligence","msr-research-area-computer-vision","msr-research-area-medical-health-genomics","msr-group-type-group","msr-locale-en_us"],"msr_group_start":"","msr_detailed_description":"","msr_further_details":"","msr_hero_images":[],"msr_research_lab":[199560,199561,199563,849856],"related-researchers":[{"type":"user_nicename","display_name":"Javier Alvarez-Valle","user_id":32137,"people_section":"Section name 0","alias":"jaalvare"},{"type":"user_nicename","display_name":"Shruthi Bannur","user_id":39213,"people_section":"Section name 0","alias":"shbannur"},{"type":"user_nicename","display_name":"Kenza Bouzid","user_id":43290,"people_section":"Section name 0","alias":"kenzabouzid"},{"type":"user_nicename","display_name":"Daniel Coelho de Castro","user_id":39811,"people_section":"Section name 0","alias":"dacoelh"},{"type":"user_nicename","display_name":"Lorin Crawford","user_id":39660,"people_section":"Section name 0","alias":"lcrawford"},{"type":"user_nicename","display_name":"Miro Dud\u00edk","user_id":32867,"people_section":"Section name 0","alias":"mdudik"},{"type":"user_nicename","display_name":"Nicolo Fusi","user_id":31829,"people_section":"Section name 0","alias":"fusi"},{"type":"user_nicename","display_name":"Pratik Ghosh","user_id":38245,"people_section":"Section name 0","alias":"prghos"},{"type":"user_nicename","display_name":"Michael 
Hansen","user_id":40723,"people_section":"Section name 0","alias":"mihansen"},{"type":"user_nicename","display_name":"Stephanie Hyland","user_id":38458,"people_section":"Section name 0","alias":"sthyland"},{"type":"user_nicename","display_name":"Juan M. Lavista Ferres","user_id":39552,"people_section":"Section name 0","alias":"jlavista"},{"type":"user_nicename","display_name":"Lester Mackey","user_id":36161,"people_section":"Section name 0","alias":"lmackey"},{"type":"user_nicename","display_name":"Hannah Richardson (nee Murfet)","user_id":37703,"people_section":"Section name 0","alias":"hamurfet"},{"type":"user_nicename","display_name":"Anthony Ortiz","user_id":39715,"people_section":"Section name 0","alias":"anort"},{"type":"guest","display_name":"Ajay Paidi","user_id":859272,"people_section":"Section name 0","alias":""},{"type":"user_nicename","display_name":"Fernando P\u00e9rez Garc\u00eda","user_id":41473,"people_section":"Section name 0","alias":"fperezgarcia"},{"type":"user_nicename","display_name":"Caleb Robinson","user_id":39606,"people_section":"Section name 0","alias":"davrob"},{"type":"user_nicename","display_name":"Philip Rosenfield","user_id":37562,"people_section":"Section name 0","alias":"phrosenf"},{"type":"user_nicename","display_name":"Valentina Salvatelli","user_id":40612,"people_section":"Section name 0","alias":"vsalvatelli"},{"type":"user_nicename","display_name":"Anton Schwaighofer","user_id":31059,"people_section":"Section name 0","alias":"antonsc"},{"type":"user_nicename","display_name":"Kristen Severson","user_id":40372,"people_section":"Section name 0","alias":"kseverson"},{"type":"user_nicename","display_name":"Harshita Sharma","user_id":41602,"people_section":"Section name 0","alias":"harssharma"},{"type":"user_nicename","display_name":"Ava Amini","user_id":40432,"people_section":"Section name 0","alias":"avasoleimany"},{"type":"user_nicename","display_name":"John Stairs","user_id":40858,"people_section":"Section name 
0","alias":"jostairs"},{"type":"user_nicename","display_name":"Kenji Takeda","user_id":32522,"people_section":"Section name 0","alias":"kenjitak"},{"type":"user_nicename","display_name":"Neil Tenenholtz","user_id":38464,"people_section":"Section name 0","alias":"netenenh"},{"type":"user_nicename","display_name":"Anja Thieme","user_id":35948,"people_section":"Section name 0","alias":"anthie"},{"type":"user_nicename","display_name":"Yixi Xu","user_id":39775,"people_section":"Section name 0","alias":"yixx"},{"type":"user_nicename","display_name":"Kevin Kaichuang Yang","user_id":39093,"people_section":"Section name 0","alias":"kevyan"}],"related-publications":[588289,256554,776380,487802,1009149,588295,256566,776422,490886,1012407,588301,256572,828454,490892,1047813,168300,588616,328487,844411,490898,1047825,168600,697009,328493,858441,490910,1047834,168601,707971,364436,917718,544644,1051485,168602,710251,364448,923706,557037,168700,745816,364454,936084,567804,168790,757420,364460,967293,588241,215422,759400,368156,981816,588271,238041,773551,487796,986898],"related-downloads":[693936,746395,746404,746410,773467,780823],"related-videos":[],"related-projects":[],"related-events":[],"related-opportunities":[],"related-posts":[],"tab-content":[],"msr_impact_theme":[],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group\/780706"}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-group"}],"version-history":[{"count":53,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group\/780706\/revisions"}],"predecessor-version":[{"id":1029279,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group\/780706\/revisions\/1029279"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/783790"
}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=780706"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=780706"},{"taxonomy":"msr-group-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group-type?post=780706"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=780706"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=780706"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}