
An AI Told Me I Had Cancer

Posted by Otto Knotzer on March 23, 2023 - 10:12am

Why was an AI looking through my medical records and how did it work? I decided to find out.

 

This story is adapted from More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech, by Meredith Broussard.

IN LATE 2019, I went in for what I thought was a routine mammogram. The radiologist reading my images told me there was an area of concern and that I should schedule a diagnostic ultrasound. At the ultrasound appointment a few days later, the tech lingered on an area of my left breast, and frowned at the screen. I knew then it would be bad. Another mammogram and several doctor visits later, it was certain: I had breast cancer.

[Image: cover of More Than a Glitch, courtesy of the MIT Press]


Everybody freaks out when given a cancer diagnosis, but exactly how you freak out depends on your personality. My own coping mechanism involves trying to learn absolutely everything I can about my condition. And, because I think the poor user interface design of electronic medical record systems can lead to communication problems among medical professionals, I always poke around in my online medical chart. Attached to my mammography report from the hospital was a strange note: “This film was read by Dr. Soandso, and also by an AI.” An AI was reading my films? I hadn’t agreed to that. What was its diagnosis?

I had an upcoming appointment for a second opinion, and I figured I’d ask what the AI found. “Why did an AI read my films?” I asked the surgeon the next day.

“What a waste of time,” said the surgeon. They actually snorted, thinking the idea was so absurd. “Your cancer is visible to the naked eye. There is no point in having an AI read it.” They waved at the nearby computer screen, which showed the inside of my breast. The black and white image displayed a semicircle on a black background filled with spidery ducts, with a bright white barbell marking the spot of my diagnostic biopsy. The cancerous area looked like a bunch of blobs to me. I felt grateful that this doctor was so expert and so eagle-eyed that they could spot a deadly growth in a sea of blobs. This was why I was going to a trained professional. I immediately decided this was the surgeon for me, and I signed a form agreeing to an eight-hour operation.

The doctors and nurses and staff who cared for me were fantastic. They were skilled at their jobs, and thoroughly professional. My cancer experience could have been terrifying, but instead it was manageable. Fast-forward a few months, and I was mercifully cancer-free and mostly recovered. I got a clean bill of health one year out, and because I was still curious about the AI that read my films, I decided to investigate what was really going on with breast cancer AI detection.

I had found out about the cancer detection AI because I was nosy and read the fine print. Patients today often don’t know that AI systems are involved in their care. This brings up the issue of informed consent. Few people read the medical consent agreements that we are required to sign before treatment, much like few people read the terms of service agreements required to set up an account on a website. Not everyone is going to be thrilled that their data is being used behind the scenes to train AI or that algorithms instead of humans are guiding medical care decisions. “I think that patients will find out that we are using these approaches,” said Justin Sanders, a palliative care physician at Dana-Farber Cancer Institute and Brigham and Women’s Hospital in Boston, to StatNews. “It has the potential to become an unnecessary distraction and undermine trust in what we’re trying to do in ways that are probably avoidable.”


I wondered if an AI would agree with my doctor. My doctor had saved me from an early grave; would an AI also detect my cancer? I devised an experiment: I would take the code from one of the many open-source breast cancer detection AIs, run my own scans through it, and see if it detected my cancer. In scientific terms, this is what’s known as a replication study, where a scientist replicates another scientist’s work to validate that the results hold up.


I HAD A colleague in the data science department who was building a breast cancer detection AI with impressive published results. I decided to suppress any feelings of weirdness about talking to a colleague about breasts, and run my own medical images through my colleague’s breast cancer detection code to investigate exactly what the AI would diagnose. (His name is Krzysztof Geras, and the code for the AI accompanied his 2018 paper “High-Resolution Breast Cancer Screening with Multi-View Deep Convolutional Neural Networks.”)

My plan went off the rails immediately.

I saw my scans in my electronic medical record (EMR). I tried to download them. I got an error. I tried to download the scans with the data anonymized, per the options. The EMR offered me a download labeled with someone else’s name. I couldn’t check whether the images were mine or this other person’s, because the download package didn’t include the files needed to open it on a Mac, my primary computer.
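For context on what “running my scans through the code” would have involved: medical images come off the scanner as high-bit-depth grayscale pixel arrays that must be scaled into a fixed numeric range before a neural network can process them. Here is a minimal preprocessing sketch in Python; the function name and the 12-bit intensity range are illustrative assumptions, not details from Geras’ published code.

```python
import numpy as np

def preprocess_scan(pixels: np.ndarray) -> np.ndarray:
    """Min-max scale raw scanner intensities into [0, 1] for a neural net.

    Mammograms are often stored with 12- or 16-bit intensities, far outside
    the range most image models expect as input.
    """
    pixels = pixels.astype(np.float32)
    lo, hi = pixels.min(), pixels.max()
    if hi == lo:  # degenerate all-flat image; avoid dividing by zero
        return np.zeros_like(pixels)
    return (pixels - lo) / (hi - lo)

# Stand-in for a real scan: a 64x64 array of 12-bit intensities.
raw = np.random.randint(0, 4096, size=(64, 64))
model_input = preprocess_scan(raw)  # values now lie in [0, 1]
```

Before any of that could happen, of course, the images had to make it out of the portal in a readable format at all.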

After a few days, I concluded that the download code was broken. I called the hospital, which put me in touch with tech support for the portal system. I got on the phone with tech support, escalated to the highest level, and nobody was interested in fixing the code or investigating. They offered to send me a CD of the images. “I don’t have a CD drive,” I told the friendly person in tech support. “Nobody has a CD drive anymore. How is there not an effective download method?”
