{"id":15888,"date":"2019-07-26T09:00:29","date_gmt":"2019-07-26T09:00:29","guid":{"rendered":"https:\/\/www.microsoft.com\/en-gb\/industry\/blog\/?p=15888"},"modified":"2019-08-29T13:46:15","modified_gmt":"2019-08-29T13:46:15","slug":"ai-medical-imaging-low-code","status":"publish","type":"post","link":"https:\/\/www.microsoft.com\/en-gb\/industry\/blog\/health\/2019\/07\/26\/ai-medical-imaging-low-code\/","title":{"rendered":"6 steps to using low-code tools to achieve better patient care"},"content":{"rendered":"
We\u2019re going to be sharing a story every week for the 12 weeks of summer, showing you how healthcare organisations are using technology to transform patient outcomes and increase productivity. For the fourth blog in our series, Nas Taibi, Solutions Architect, details how, thanks to low-code\/no-code services, introducing AI into medical imaging is no longer limited to coding experts.<\/em><\/p>\n The shift towards value-based care has seen healthcare facilities seeking low-code and no-code innovations that accelerate operational outcomes and create a financially sustainable care system.<\/p>\n One recent idea that\u2019s garnered attention in the enterprise imaging world is AI augmentation \u2013 using machine learning to process, analyse, and interpret medical images. Embedding the technology into enterprise imaging systems has improved clinicians\u2019 decision-making and lightened the burden on reporting practitioners. Using low-code or no-code technology, professionals find they can work faster and better than before.<\/p>\n But it has another far-reaching benefit: improving people\u2019s health by accurately detecting diseases early.<\/p>\n Imagine a healthcare facility. It\u2019s seeking to use machine learning to automate and improve the prediction accuracy of a foetus\u2019 gestational stage. Traditionally, sonographers have manually measured the biparietal diameter and head circumference using callipers.<\/p>\n The machine learning model is trained on legacy ultrasound images that already carry the sonographer\u2019s manual measurements, learning to closely match the accuracy of the original findings. With AI at their side, the facility can capture this data, then embed it in the ultrasound images for use during training as a reference data point.<\/p>\n <\/p>\n Now, imagine an engineering team is prototyping a solution to gather, clean, and pre-process those images. 
The collected sample data will be used to train and test the model.<\/p>\n The medical imaging system stores the files in Azure\u2019s Binary Large Object storage. Adding each file triggers the Azure Logic App workflow. The workflow pulls the message through and extracts the blob URL, before grabbing the JPEG pixel data and JSON metadata from the DICOM image. Next, the system performs optical character recognition on the image \u2013 essentially letting it \u2018see\u2019 the photo and pull out metadata.<\/p>\n And, best of all, it\u2019s all performed automatically.<\/p>\n <\/p>\n AI tools and framework adoption is growing in the healthcare sector. Fully managed cloud-based machine learning services<\/a> can be used to train, deploy, and manage models at scale. Then there are low-code\/no-code tools.<\/p>\n You don\u2019t need to be a technical expert to use them. There\u2019s no need to learn code.<\/p>\n All you need is access to Azure Cognitive Services<\/a>, which provide pre-built machine learning models. They\u2019re designed to help you \u201cbuild intelligent applications without having direct AI or data science skills or knowledge.\u201d<\/p>\n Helping streamline the development process is the low-code Azure Machine Learning Studio<\/a>. It lets you deploy pre-built machine-learning algorithms, then connect datasets that integrate with custom apps.<\/p>\n Used together, these Microsoft services make it easy to transform the workplace without coding skills. You can, instead, focus on delivering improved ROI, a superior experience for employees, and even higher-quality care.<\/p>\n <\/p>\n Healthcare facilities across the country face a similar problem: some ultrasound images are essentially screenshots of screenshots.<\/p>\n While modern machines are able to embed those all-important measurements into the image data, these older images simply display the measurements the sonographer took during the scan as text burned into the pixels. 
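The ingestion flow described above \u2013 blob trigger, URL extraction, DICOM split, OCR \u2013 can be sketched in Python. Everything here is a stand-in for illustration: the helper names, the event shape, and the OCR payload are assumptions, not real Azure SDK or Cognitive Services calls.

```python
import json

def on_blob_added(event):
    # Simulates the Logic App trigger: pull the blob URL out of the
    # event message (the event shape here is an assumption).
    return event["data"]["url"]

def split_dicom(blob_url):
    # Stand-in for the real DICOM split (a library such as pydicom
    # would do this): return pixel data as JPEG bytes plus tags as JSON.
    tags = {"PatientName": "DOE^JANE", "StudyDate": "20190726"}
    return b"\xff\xd8", json.dumps(tags)  # truncated fake JPEG bytes

def run_ocr(jpeg_bytes):
    # Stand-in for the Cognitive Services OCR call: returns the text
    # burned into the image banner, with a simplified bounding box.
    return [{"text": "BPD 48.3mm", "boundingBox": [10, 10, 120, 28]}]

event = {"data": {"url": "https://example.blob.core.windows.net/scans/scan01.dcm"}}
blob_url = on_blob_added(event)
jpeg, meta_json = split_dicom(blob_url)
banner = run_ocr(jpeg)
print(banner[0]["text"])  # the metadata the OCR 'sees' in the pixels
```

In the real workflow each of these stubs is replaced by a managed service wired together in the Logic App designer, which is exactly why no bespoke code needs to be written.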
The first step is obtaining these images.<\/p>\n The next step sees you extract the image\u2019s pixels, then use open-source tools to convert the original DICOM file to JPEG.<\/p>\n Armed with this JPEG, it\u2019s time to run the image through optical character recognition. Since this is achieved via Microsoft\u2019s smart Cognitive Services, it\u2019s easy to perform.<\/p>\n Watch out though, as this process often turns up personally identifiable information, such as names. Worse, it\u2019s displayed as a banner in the pixel data, so it becomes imperative to identify and mask this.<\/p>\n Time to unleash those Cognitive Services again. At this stage, you can use pre-built services to easily extract the biparietal diameter and head circumference measurements from your JPEG. These measurements let you calculate the gestational age.<\/p>\n A patient\u2019s personal data is often stored in the image\u2019s metadata, as well as the banner. And, in the interests of patient privacy, you\u2019ll need to thoroughly de-identify these images before sending them on for further processing.<\/p>\n For metadata de-identification, the reference values in the DICOM tags are checked against the resulting OCR JSON payload. Any match must be masked by identifying the coordinates, width, and height of its bounding box.<\/p>\n Now, deploy Cognitive Services once more to detect the bounding box around any personal information. Details like a name or Medical Record Number are masked by a rectangle automatically drawn around sensitive information.<\/p>\n Finally, this data needs to be available for interoperability. That\u2019s where you\u2019ll want to use the Azure FHIR API<\/a> services. This lets the data flow to the rest of the downstream analytics systems.<\/p>\n By taking advantage of low-code\/no-code services, the healthcare sector finds itself in a better position to innovate. 
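The masking performed in the de-identification steps above can be illustrated with a small Python sketch. The pixel grid, the OCR payload, and the four-value x, y, width, height bounding box are simplifying assumptions for illustration; the real service returns richer geometry.

```python
def mask_pii(pixels, ocr_lines, pii_values):
    # Black out any OCR line whose text matches a known PII value
    # taken from the DICOM tags (e.g. patient name or MRN).
    for line in ocr_lines:
        if any(value in line["text"] for value in pii_values):
            x, y, w, h = line["boundingBox"]
            for row in range(y, y + h):
                for col in range(x, x + w):
                    pixels[row][col] = 0  # draw the masking rectangle
    return pixels

# A tiny greyscale 'image' and a fake OCR payload, for illustration only.
pixels = [[255] * 8 for _ in range(4)]
ocr = [
    {"text": "DOE^JANE", "boundingBox": [2, 1, 4, 2]},    # PII banner
    {"text": "HC 175.2mm", "boundingBox": [0, 3, 6, 1]},  # measurement
]
masked = mask_pii(pixels, ocr, pii_values={"DOE^JANE"})
print(masked[1][2:6])  # → [0, 0, 0, 0]: the name banner is blacked out
print(masked[3][0])    # → 255: the measurement line is left intact
```

Because only tag-matched regions are masked, the measurement text survives and can still be extracted and passed downstream via the FHIR API.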
No need for a huge capital investment to hire subject matter experts long before they\u2019re required.<\/p>\n These easy-to-deploy services are creating a new breed of devs, dubbed Citizen Developers: professionals who can quickly create and automate a business workflow or a cumbersome form-based routine without needing complex coding skills.<\/p>\n By leveraging the power of AI and Azure cloud computing, cleaning, pre-processing, and de-identifying ultrasounds becomes quicker and easier than ever before. Feed the results into a machine learning system to help healthcare professionals make quicker diagnoses and improve the all-round experience for patients and employees.<\/p>\nPicture the scene: machine learning and ultrasound<\/h2>\n
Picture the scene: preparing and cleaning the data<\/h2>\n
The low-code technology at play<\/h2>\n
Step 1 \u2013 Finding the measurements<\/h2>\n
Step 2 \u2013 Pixel extraction and conversion<\/h2>\n
Step 3 \u2013 Data extraction<\/h2>\n
Step 4 \u2013 De-identifying the information<\/h2>\n
Step 5 \u2013 Automatically hide information<\/h2>\n
Step 6 \u2013 Ensuring interoperability<\/h2>\n
Low code Citizen Developers changing healthcare<\/h2>\n
Find out more<\/h2>\n