Google Releases AI Tool for Pathology

Apr 15, 2024 | AI, Digital Pathology

Google has released a free cloud-based tool (“Path Foundation”) that transforms digitized slide images into numerical data that researchers can use to create AI tools for pathology. “This is a landmark for us,” says David Steiner, MD, PhD, Clinical Research Scientist at Google. Based on user feedback on Path Foundation, Google will decide how to proceed toward developing commercial AI tools for pathologists.

For more than seven years, Google has employed a team of clinical research scientists based in Mountain View, California, and London, England, tasked with developing AI software tools for radiology and pathology. Below is a summary of LE’s interview with Google’s David Steiner, MD, PhD, Clinical Research Scientist.

How can labs and pathologists use Path Foundation?

Commercial labs and academic medical centers, for example, can use Path Foundation to convert patches of their digitized slide images into numerical vectors, known as embeddings. These embeddings capture important features and patterns in digitized slide images, which Path Foundation learned during its training on millions of pathology images. After the embeddings for each image are collected, this data can be used to train custom algorithms for a range of tasks, such as identifying tissue type, detecting tumors, or performing quality assurance on digitized images.
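As a rough illustration of this pattern, here is a minimal sketch that trains a small classifier on patch embeddings, assuming the embeddings and their labels have already been exported; the file names, array shapes, and classifier choice are placeholders for illustration, not part of Google's published API.

```python
# Minimal sketch: train a custom classifier on precomputed patch embeddings.
# The .npy file names and label scheme below are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# One embedding vector per image patch and one label per patch
# (e.g., 1 = tumor, 0 = normal tissue).
embeddings = np.load("patch_embeddings.npy")   # shape: (n_patches, embedding_dim)
labels = np.load("patch_labels.npy")           # shape: (n_patches,)

X_train, X_test, y_train, y_test = train_test_split(
    embeddings, labels, test_size=0.2, random_state=0, stratify=labels
)

# A simple linear classifier on top of foundation-model embeddings is often enough.
clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)

probs = clf.predict_proba(X_test)[:, 1]
print(f"Held-out AUC: {roc_auc_score(y_test, probs):.3f}")
```

Because the heavy feature extraction has already been done by the foundation model, a lightweight classifier like this can often be trained with far fewer labeled patches and far less compute than a deep network trained from scratch.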

What’s the benefit of using a “Foundation” model to analyze content in images?
Leveraging the embeddings from Path Foundation requires less data and computational resources than traditional methods, giving researchers a big head start toward developing their own algorithms.

In contrast, traditional methods such as strongly supervised deep learning models require more resources, because every task requires training its own deep learning model from scratch, along with many labeled images for each category of interest.

A foundation model is like a seasoned guitar player quickly learning a new song by ear. Because the guitar player has already built up a foundation of skill and understanding, they can quickly pick up the patterns and groove of a new song.

Where did you get the digitized images to train Path Foundation on?
We used 20,000 whole-slide images, covering millions of image patches, from a variety of sources, including the National Cancer Institute’s Cancer Genome Atlas, as well as academic medical centers and some private pathology labs.

What is the user roadmap to getting started with Path Foundation?
After filling out the access form, users can run a small demo notebook that walks them through how to train a tumor classifier. To use Path Foundation on their own task (e.g., identifying tissue type), they would need to collate a set of digital pathology images with accompanying labels. The user would then upload these images and labels into Google Cloud.

From there, users can adapt the demo notebook to call the Path Foundation API on their uploaded images. Using the existing code in the demo notebook, they would then train a model to classify their images based on the labels and evaluate its performance on a held-out part of the dataset that was not used to train the model. In the future, we hope to make it even easier to use Path Foundation and the embeddings directly in your pathology slide viewer, with no coding required.
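The outline below sketches how those steps might fit together in code, under the assumption of a setup like the demo notebook's; `request_embedding`, the folder layout, and the model choice are stand-ins for illustration, not the actual notebook code or API.

```python
# Illustrative outline of the workflow described above; not the demo notebook itself.
from pathlib import Path

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split


def request_embedding(image_bytes: bytes) -> np.ndarray:
    """Placeholder for the demo notebook's API call that returns one patch embedding."""
    raise NotImplementedError("Replace with the Path Foundation call from the demo notebook.")


# 1) Collate labeled patches: here, one folder per label (e.g., data/tumor, data/normal).
embeddings, labels = [], []
for label_dir in sorted(Path("data").iterdir()):
    for image_path in label_dir.glob("*.png"):
        embeddings.append(request_embedding(image_path.read_bytes()))
        labels.append(label_dir.name)

X = np.vstack(embeddings)
y = np.array(labels)

# 2) Train a classifier on the embeddings, then 3) evaluate it on a held-out split.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```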

We’re offering Path Foundation free to users on github.com/Google-Health.

Do labs that use Path Foundation need to share their digitized images or the results of their research with Google?
They don’t need to share the images or the results of their research with Google. They do need to have their images stored in Google Cloud (in their own private or institutional account), but these are kept private to the user and not accessible by Google. In addition, when the images are sent to the Path Foundation model to compute the embeddings, they are not saved or stored by Google.

Any plans to develop commercial AI software tools like Ibex, Paige or PathAI?
Yes. Path Foundation represents a landmark toward that end.

What is the next step for your research team in terms of developing AI tools for pathologists?
We’ll take feedback from Path Foundation users to understand key use cases and how to make the tool better and easier to use. We’ll also explore how these embeddings might be used with large language and large multimodal models (LLMs and LMMs). And then we plan to develop useful approaches and models for working with whole-slide images (WSIs), in addition to the “patch-based” models and applications that this current tool represents.

Has Google completed any studies related to AI-assisted diagnostic tools for cancer?
Yes. We have published several studies in peer-reviewed journals.

Most recently, we published a study that used AI to predict immunotherapy outcomes from digitized slide images in non-small cell lung cancer (Cancer Research, vol. 84, 2024).

In 2023, we published a paper that showed how AI can be used for clinical decision-making in colorectal cancer (Nature Communications Medicine, vol. 3, 2023). And, in 2022, we published a study that used AI for diagnosis and Gleason Grading of prostate cancer (Nature Medicine, vol. 28, 2022).

We have also published studies focusing on AI models for breast cancer.

Will AI algorithms eventually replace pathologists?
AI will make pathologists better rather than replace them. Initially, AI will be used to automate repetitive tasks, such as locating the image patches that pathologists should focus on. Eventually, AI could be used to query images. Pathologists and researchers may someday be able to type in specific questions and get answers from AI about an image.
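As a rough illustration of that patch-triage idea, the sketch below ranks patches by a hypothetical tumor score so the highest-scoring regions can be reviewed first; the function, scores, and coordinates are invented for the example and do not describe a Google tool.

```python
# Hypothetical sketch: point a pathologist at the patches a model scores highest.
import numpy as np


def top_patches_for_review(patch_scores, patch_coords, k=10):
    """Return slide coordinates of the k patches with the highest tumor scores."""
    order = np.argsort(patch_scores)[::-1][:k]
    return [patch_coords[i] for i in order]


# Example: scores from a patch-level classifier (one score per patch) and the
# top-left coordinates of those patches on the whole-slide image.
scores = np.array([0.02, 0.91, 0.40, 0.87])
coords = [(0, 0), (512, 0), (0, 512), (512, 512)]
print(top_patches_for_review(scores, coords, k=2))  # [(512, 0), (512, 512)]
```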

Is there the potential to integrate pathology and radiology image data using AI?
I’m excited to explore this and do see the promise in bringing these two specialties together. I believe the combination will yield more than the sum of the parts. Google does have a team of research scientists working on AI tools for radiology. And we did introduce an AI tool for chest x-rays (CXR Foundation) in July 2022.