In his office at the VA hospital in Seattle, Dr. Nadeem Zafar needed to settle a debate.
Zafar is a pathologist, the kind of doctor who carries out clinical lab tests on bodily fluids and tissues to diagnose conditions like cancer. It's a specialty that often operates behind the scenes, but it's a crucial backbone of medical care.
Late last year, Zafar's colleague consulted with him about a prostate cancer case. It was clear that the patient had cancer, but the two doctors disagreed about how severe it was. Zafar believed the cancer was more aggressive than his colleague did.
Zafar turned to his microscope – a beloved tool that pathologists rely on to help make their diagnoses. But the device is no ordinary microscope. It's an artificial intelligence-powered microscope built by Google and the U.S. Department of Defense.
The pair ran the case through the special microscope, and Zafar was right. In seconds, the AI flagged the exact part of the tumor that Zafar believed was more aggressive. After the machine backed him up, Zafar said his colleague was convinced.
"He had a smile on his face, and he agreed with that," Zafar told CNBC in an interview. "This is the beauty of this technology, it's kind of an arbitrator of sorts."
The AI-powered tool is called an Augmented Reality Microscope, or ARM, and Google and the Department of Defense have been quietly working on it for years. The technology is still in its early days and is not actively being used to help diagnose patients yet, but initial research is promising, and officials say it could prove to be a useful tool for pathologists without easy access to a second opinion.
There are currently 13 ARMs in existence, and one is located at a Mitre facility just outside of Washington, D.C. Mitre is a nonprofit that works with government agencies to tackle big problems involving technology. Researchers there are working with the ARM to identify the vulnerabilities that could cause issues for pathologists in a clinical setting.
At first glance, the ARM looks a lot like a microscope that could be found in a high school biology classroom. The device is beige with a large eyepiece and a tray for examining traditional glass slides, but it's also connected to a boxy computer tower that houses the AI models.
When a glass slide is prepared and fixed under the microscope, the AI is able to outline where cancer is located. The outline appears as a bright green line that pathologists can see through their eyepiece and on a separate monitor. The AI also indicates how severe the cancer is, and it generates a black-and-white heat map on the monitor that shows the boundary of the cancer in pixelated form.
CNBC demoed the ARM with researchers at the Mitre facility in August.
Patrick Minot, a senior autonomous systems engineer at Mitre, said since the AI is overlaid directly onto the microscope's field of view, it doesn't interrupt the pathologists' established workflow.
That ease of use is an intentional design choice. In recent years, pathologists have been contending with workforce shortages, just like other corners of health care. But pathologists' caseloads have also been mounting as the general population grows older.
It's a dangerous combination for the specialty. If pathologists are stretched too thin and miss something, it can have serious consequences for patients.
Several organizations have been trying to digitize pathology as a way to increase efficiency, but digital pathology comes with its own host of challenges. Digitizing a single slide can require a significant amount of storage, so the infrastructure and costs associated with large-scale data collection can balloon quickly. For many smaller health systems, digitization is not yet worth the hassle.
The ARM is not meant to replace digital pathology systems, but Minot said it can help health organizations bypass the need for them. Pathologists can, for instance, take screen grabs of slides using the ARM's software, which are much less expensive to store than fully digitized slides.
The ARM will usually cost health systems between $90,000 and $100,000.
Minot added that the ARM ensures the physical microscope, not just a computer, remains an integral part of the pathologists' process. Many have warned him not to mess with their microscopes, he joked.
Few understand the challenges facing pathologists quite like Dr. Niels Olson, the chief medical officer of the Defense Innovation Unit, or DIU, at the Department of Defense.
The DIU was created in 2015 as a way for the military to integrate cutting-edge technology developed by the commercial world. The organization negotiates contracts with companies so they can collaborate and circumvent long bureaucratic hang-ups.
Olson is a pathologist, and before beginning his role at the DIU, he served in the U.S. Navy. In 2018, he was sent to Guam, a U.S. island territory in Micronesia, where he worked as the laboratory medical director and blood bank director in the Naval Hospital.
During his two years in Guam, Olson was one of two pathologists on the island, and the only pathologist in the Naval Hospital. This meant he was often making major decisions and diagnoses on his own.
"It's not just your job to say 'This is cancer, it's this kind of cancer.' Part of the job is saying 'It's absolutely not cancer,' and that can be nerve wracking when you're alone," Olson told CNBC in an interview. "I would have loved to have an Augmented Reality Microscope in Guam, just so there'd be somebody, something else helping."
The ARM is meant to serve as a second line of defense for pathologists, and Olson said it would not replace the doctors themselves. He added that the obvious initial use case for the microscope would be in smaller, remote labs, and it could also serve as a resource for pathology residents in training.
But Olson had dreamed up a tool like the ARM long before his time in Guam. On Aug. 10, 2016, while working as a resident in the Naval Medical Center in San Diego, Olson decided to message a connection he had at Google. In the email, which was viewed by CNBC, Olson described a rough idea of what a microscope like the ARM could be.
For a while, Olson said he heard nothing. But months later, he was standing in a Google office building in Mountain View, California, crammed in a locked room that only a few people at the company had access to. There, he watched as an early AI-powered microscope successfully identified cancer on a small set of slides he had brought with him.
Olson said the room was sweltering because everyone inside was so "pumped."
"I don't want to say it's quite like seeing your kid for the first time, but it was sort of like, this is awesome, this is gonna be a thing," Olson said.
Around the time he was sent to Guam, a product manager at the DIU came across Olson's research. The pair wrote together in 2019 about how the Department of Defense and Silicon Valley could join forces to leverage AI. They said there are millions of patients enrolled in the federal government's health care systems, which means it boasts "the most comprehensive healthcare dataset in the world." That data has obvious commercial use.
"Big data is what Silicon Valley does best, and the potential for spillover into civilian healthcare systems is vast," they wrote.
Shortly thereafter, the DIU began looking for commercial partners to help build and test the ARM. The organization picked the optical technology company Jenoptik to handle the hardware, and after evaluating 39 companies, it selected Google to develop the software.
Aashima Gupta, global director of health care strategy and solutions at Google Cloud, said the company has since launched four algorithms for the ARM that can identify breast cancer, cervical cancer, prostate cancer and mitosis. The AI models are trained on data from the DIU, and Gupta said neither Google employees nor Google infrastructure have access to it.
"It's encrypted all the way," Gupta told CNBC in an interview. "From how the data is collected, how it is stored and how it is analyzed, and anything in between."
With the hardware and the software in order, the DIU has been carrying out initial research to test the ARM's efficacy.
In the fall of 2022, the organization published a study in the Journal of Pathology Informatics. The paper found that the breast cancer AI algorithm performed reasonably well across a large domain of samples, but there are caveats, said David Jin, the lead author on the paper and the deputy director for AI assessment at the Department of Defense's Chief Digital and Artificial Intelligence Office.
The paper specifically examined how well the AI performed when detecting breast cancer metastasis in lymph nodes, and Jin said it did better on certain types of cells than others. He said the study is promising, but there's still a "huge" amount of rigorous testing to be done before it can support pathologists with real patient care.
"Something like this has an extreme potential for benefit, but also there's a lot of risks," as it would change how cancer diagnosis is done, Jin told CNBC in an interview.
Olson, who returned from Guam and began working at the DIU in 2020, is also listed as an author on the paper. He said independent assessments of the other three models, for prostate cancer, mitosis and cervical cancer, have not been carried out at the DIU yet.
Research with the ARM is ongoing, and the DIU is also soliciting feedback from organizations like Mitre and health systems like Veterans Affairs. There is work to be done, but since the DIU has validated the initial concept, the organization is beginning to think about how to scale the technology and collaborate with regulators.
The DIU negotiated agreements with Google and Jenoptik that will allow the technology to be distributed through the military and commercially. The DIU is hoping to make the ARM available to all government users through the General Services Administration website sometime this fall.
Zafar of VA Puget Sound said that ultimately, though the ARM will certainly aid pathologists, the general public will benefit most from the technology. He said the ARM's accuracy, speed and cost effectiveness will all contribute to better care.
"AI is here, and it's going to keep developing," Zafar said. "The point is not to be afraid of these technologies, but to triage them to the best use for our medical and health care needs."