- Google’s artificial intelligence team has developed an algorithm that’s like “spell check” for pathologists, the doctors responsible for diagnosing cancer patients through images of their cells.
- In two papers published Friday, Google found that its algorithm complemented what pathologists could pick up from the images when determining how far patients’ cancers had spread into their lymph node tissue.
- “This represents a demonstration that people can work really well with AI algorithms, better than either one alone,” Yun Liu, a member of the Google AI team and an author on the papers, told Business Insider.
Google is developing an artificial-intelligence tool to help doctors diagnose breast cancer.
The tool, known as LYmph Node Assistant, or LYNA, could one day act as a sort of “spell check” for pathologists, the doctors responsible for diagnosing cancer patients through images of their cells.
To train the algorithm, Google used a de-identified dataset based on scans of breast cancer patients’ lymph nodes from medical centres in the Netherlands.
Tissue taken from the lymph nodes can reveal whether a patient’s breast cancer has spread beyond the breast. Pathologists examine these tissue samples to get a sense of how far a particular patient’s tumour has spread, and how aggressive the cancer might be.
On Friday, Google released two papers, published in the journals Archives of Pathology & Laboratory Medicine and The American Journal of Surgical Pathology. The first paper set out to show that the algorithm could pick up cancer cells in the tissue images presented to it. In addition to the slides from the Netherlands, the algorithm was also evaluated on 108 slides from another laboratory.
The second paper, which compared how the algorithm performed alongside pathologists and on its own, found that it was important for pathologists and the algorithm to work in tandem. Both papers found that the algorithm could distinguish slides with metastatic cancer from those without it 99% of the time.
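Neither paper’s full method is reproduced here, but the slide-level call described above can be illustrated with a minimal, hypothetical sketch: tile the whole-slide image into patches, score each patch with a classifier, and flag the slide if any patch looks like tumour. The `score_patch` stand-in, the patch size, and the max-score aggregation below are assumptions for illustration, not Google’s actual model.

```python
import numpy as np

def score_patch(patch: np.ndarray) -> float:
    """Stand-in for a trained patch classifier.

    A real system like the one described in the papers would run a deep
    neural network here; this placeholder just returns a dummy probability
    so the sketch is runnable.
    """
    return float(np.random.rand())

def slide_metastasis_score(slide: np.ndarray, patch_size: int = 256) -> float:
    """Tile a whole-slide image into patches, score each patch for tumour,
    and report the highest patch score as the slide-level score."""
    h, w = slide.shape[:2]
    scores = []
    for y in range(0, h - patch_size + 1, patch_size):
        for x in range(0, w - patch_size + 1, patch_size):
            patch = slide[y:y + patch_size, x:x + patch_size]
            scores.append(score_patch(patch))
    return max(scores) if scores else 0.0

# Example: a fake 1024x1024 RGB "slide"; real pathology slides are gigapixel images.
fake_slide = np.zeros((1024, 1024, 3), dtype=np.uint8)
score = slide_metastasis_score(fake_slide)
print(f"Slide flagged as metastatic: {score > 0.5} (score={score:.2f})")
```

Taking the maximum patch score is one simple way to turn many local predictions into a single slide-level answer: a slide is only called clean if every patch scored low.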
Think of it like spell-check on a computer. You might miss a word whose letters are jumbled or that’s missing an “e,” but the algorithm can catch it. On the flip side, if there’s a word you typed twice, the algorithm might not be as good at catching the error.
Notably, the researchers found that pathologists who were given the tool picked up cancerous cells on an image better than pathologists who didn’t get the tool, and better than the tool used on its own.
“This represents a demonstration that people can work really well with AI algorithms, better than either one alone,” Yun Liu, a member of the Google AI team and an author on the papers, told Business Insider.
This isn’t the first time Google’s parent company Alphabet has applied artificial intelligence to healthcare. Through DeepMind, Alphabet’s artificial intelligence company, it’s been working to identify diseases by looking at images of patients’ eyes.
And Google itself has been working on an algorithm to diagnose diabetic retinopathy, a type of eye disease found in patients with diabetes.
Next, the team will have to see how the algorithm works in the clinic – that is, when the algorithm is given new scans of breast cancer patients’ lymph nodes. Should that go well, it could one day become part of the process doctors use to understand a patient’s breast cancer diagnosis.