Augmented Reality Microscope To Detect Cancer Cells

May 10, 2018

The power of Artificial Intelligence has expanded rapidly across technologies large and small. In 2017, we witnessed a medical breakthrough when AI was used to identify arrhythmias – irregular heart rhythms. In April 2018, Google announced that a team of its researchers had developed a microscope that performs real-time diagnosis of cancer. To tackle a disease that is the second leading cause of death after cardiovascular disease, Google has come up with a tool to help pathologists spot cancer cells.


How Does It Work?

The researchers remodeled a standard optical microscope by integrating augmented reality with a deep learning neural network. They describe this tool as an ‘Augmented Reality Microscope’, or ARM.

The augmented reality (AR) tech is fused with the optics inside the microscope. Once a biopsy (the removal of a small tissue sample from a person’s body to check for disease) is performed by surgeons, the tissue is sent to the pathology department to be examined for cancer cells. The standard method of detection is to examine the tissue through a microscope. In the ARM, however, a neural network is integrated into that workflow for cancer detection. In a way, the eyepiece is the eye, the AR display is the image the eye sees, and the network plays the role of the brain.

For a deeper understanding of how the ARM functions, read on –

The pathologist views the sample tissue through the eyepiece, and the system captures the FoV (field of view). As the figure above shows, the viewer sees the projections (on the AR display) superimposed directly on the optical image of the actual sample, as opposed to a separately digitized one – thus seeing an ‘augmented reality’ view of the specimen being studied. The AI tech (the neural network) then delivers high-quality diagnostic performance.

The neural network is trained for tumor detection. Its output is a heatmap depicting the likelihood of cancer at each pixel location. This output is displayed as an overlay on the sample in real time. The projection of the deep learning predictions into the optics has high contrast, making it clearly visible on top of the tissue sample. Even first-time users (pathologists) found the process quick and easy to work with.
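The heatmap-to-overlay step can be sketched in a few lines. This is a toy illustration in plain Python, not Google's actual pipeline: `predict_heatmap` is a hypothetical stand-in for the real neural network, and the high-contrast green mask mimics the AR projection.

```python
def predict_heatmap(fov):
    """Hypothetical stand-in for the ARM's neural network:
    returns a per-pixel tumor likelihood in [0, 1].
    Toy heuristic: darker (more densely stained) pixels score higher."""
    return [[1.0 - sum(px) / (3 * 255) for px in row] for row in fov]

def overlay_predictions(fov, threshold=0.7):
    """Paint a high-contrast green mask wherever the heatmap exceeds
    the threshold, mimicking the real-time AR display overlay."""
    heatmap = predict_heatmap(fov)
    return [
        [(0, 255, 0) if h > threshold else px
         for px, h in zip(row, hrow)]
        for row, hrow in zip(fov, heatmap)
    ]

# A fake 4x4 RGB field of view with one dark (suspicious) pixel.
fov = [[(220, 220, 220)] * 4 for _ in range(4)]
fov[1][2] = (30, 30, 30)
result = overlay_predictions(fov)
print(result[1][2])  # (0, 255, 0): the suspicious pixel is highlighted
```

In the real ARM the heatmap comes from a deep network and the overlay is projected into the optics rather than drawn on a screen, but the data flow – per-pixel likelihood, threshold, high-contrast highlight – is the same.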

Google has published a paper on the ARM. In its abstract, the authors note how AI has been shown to improve diagnostic accuracy and to support translational research.


Why Is This Better Than Standard Pathology Work?

•  For one, pathologists agree that the conventional workflow is tedious and time-consuming. In some cases, the standard approach limits their capacity for efficient assessment.

•  As a bonus, the equipment cost is actually low.

•  The ARM keeps pace with rapid and semicontinuous movements of the slide under view.

•  The system has low latency, i.e., minimal delay between moving the slide and seeing the updated output.

•  By leveraging the latest AI algorithms, the ARM has the potential to improve speed, quality and accuracy of quantitative tissue assessments in medical practices.
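The low-latency point above boils down to a tight capture–infer–display loop whose per-frame delay stays below what the viewer perceives. A minimal sketch, with hypothetical placeholder functions standing in for the camera, the model, and the AR display:

```python
import time

def capture_fov():
    """Hypothetical camera capture; returns a frame (placeholder)."""
    return [[(200, 200, 200)] * 8 for _ in range(8)]

def infer(frame):
    """Hypothetical model inference; returns a heatmap (placeholder)."""
    return [[0.1] * len(row) for row in frame]

def display_overlay(frame, heatmap):
    """Hypothetical AR display update (placeholder)."""
    pass

# Measure end-to-end latency per frame: the delay the pathologist
# perceives between moving the slide and seeing updated predictions.
latencies = []
for _ in range(5):
    t0 = time.perf_counter()
    frame = capture_fov()
    heatmap = infer(frame)
    display_overlay(frame, heatmap)
    latencies.append(time.perf_counter() - t0)

print(f"mean latency: {1000 * sum(latencies) / len(latencies):.2f} ms")
```

Keeping this whole loop fast is what lets the ARM track the rapid, semicontinuous slide movements mentioned above.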


All Hail Augmented Reality!

Cancer research has always been a top priority for the healthcare world. Overpowering this unfortunate and sometimes catastrophic disease has been a tough fight for mankind for centuries. It now seems that we’re ARM-ed well with tech on our side. With Google’s ARM, the study and application of Augmented Reality is being driven in a brilliant and hopeful direction!