Two and a half years ago, my friend Laura was experiencing pain and had an x-ray of her abdomen. The radiologist’s report determined it was most likely endometriosis. It made sense. She had a history.
But demonstrating an intuition that would repeat many times during her last two years, Laura decided to get a second opinion. Fortunately, she knew another radiologist.
He reached a very different conclusion: lymphoma. She died in July, nearly two years later to the day.
The first radiologist’s mistake — which he corrected on a second reading — didn’t impact the course of her disease, although it’s easy to imagine it could have. Mistakes do happen, and we hope, even assume, they’re just that — unusual mistakes.
But I found myself thinking of her experience while listening to a recent NPR report, “Why Even Radiologists Can Miss a Gorilla Hiding in Plain Sight.”
It seems radiologists, and humans in general, can be so focused on one thing that we completely overlook other things, even something as unusual as a man in a gorilla suit.
There’s a famous experiment demonstrating this, involving people passing basketballs, but a curious Harvard Medical School attention researcher, Trafton Drew, adapted it to determine whether radiologists might be so focused on a search for cancer that they would overlook another major issue.
To test that, he literally added a picture of a man in a gorilla suit to their scans to see whether even these highly trained medical observers would succumb to the Invisible Gorilla Effect.
His finding? Eighty-three percent of the radiologists missed it.
That bothered me, obviously, and made me wonder if that’s what happened with Laura’s first reading. But what was even more troubling was this response posted on Radiology Daily: “Radiologists Miss Gorilla in Lung; So What?”
It included this quote by science writer Virginia Hughes:
“Let’s say more of the experts had noticed the gorilla. That would necessarily mean that they weren’t as focused on the task they were instructed to perform: find the cancer.”
Talk about missing the point. Turn the tables: what if they’re so busy looking for pneumonia, scar tissue or, say, endometriosis that they miss the cancer?
So what does any of this have to do with anything I normally write about?
SAP’s Jonathan Becher pointed out the broader implications in a recent Forbes article:
“The confirmation that expert observers suffer from inattentional blindness raises some troubling questions. By training radiologists to identify white nodules, are they more likely to miss other life-threatening anomalies? Could the same issues pertain to other expert observers like MRI technicians, air traffic controllers, and police officers?”
It’s time we recognized the Invisible Gorilla Effect in how we train professional observers. To my mind, that has to include those who analyze data, and in particular, Big Data.
In some ways, Big Data may be less susceptible to this problem, at least right now. Many experts already recommend that organizations examine large datasets, and particularly unstructured data, without an agenda, approaching them more like an anthropologist would: to see what turns up that’s unique or unusual.
But there are also those who say you should have a very specific question you’re trying to answer with Big Data.
Honestly, that’s the approach most likely to win out, simply because it’s the one that can be pursued as a pilot or in small, low-hanging-fruit projects.
But those are also the types of projects most likely to miss the invisible gorilla.
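To make the difference concrete, here’s a minimal sketch in Python, using made-up numbers rather than any real dataset. The records, the density field, and both thresholds are hypothetical; the question-driven function answers exactly what it was asked and nothing more, while the exploratory scan flags anything that sits far from the rest of the data.

```python
from statistics import median

# Hypothetical records; the 2.75 value is our "gorilla."
records = [
    {"region": "upper", "density": 0.31},
    {"region": "upper", "density": 0.29},
    {"region": "lower", "density": 0.33},
    {"region": "lower", "density": 0.30},
    {"region": "lower", "density": 2.75},  # wildly out of range
]

def targeted_question(data):
    """Answer one pre-set question: is average upper-region density above 0.5?"""
    upper = [r["density"] for r in data if r["region"] == "upper"]
    return sum(upper) / len(upper) > 0.5

def exploratory_scan(data, k=5.0):
    """No fixed question: flag anything far from the median of the data.

    Uses median absolute deviation, a simple robust outlier check.
    """
    values = [r["density"] for r in data]
    med = median(values)
    mad = median(abs(v - med) for v in values)
    return [r for r in data if abs(r["density"] - med) > k * mad]

print(targeted_question(records))  # False: question answered, gorilla never seen
print(exploratory_scan(records))   # [{'region': 'lower', 'density': 2.75}]
```

Neither approach is wrong in itself; the point is simply that the first one, by construction, can never surface anything it wasn’t told to look for.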
And then there’s the whole question of algorithms, which are increasingly the behind-the-scenes force in data and the world at large. We’re creating code that uses algorithms many of us don’t even understand, according to this TEDGlobal talk by Kevin Slavin. And some of that algorithm-based code is wrong.
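What does that look like in practice? Here’s a hypothetical sketch, not drawn from any real system: a routine data-cleaning step that silently discards whatever falls outside the range its author considered plausible. The bounds, the values, and the clean function are all invented for illustration.

```python
# An "expected" range baked in by whoever wrote the pipeline.
EXPECTED_RANGE = (0.0, 1.0)

def clean(readings):
    """Keep only 'plausible' values; everything else vanishes without a trace."""
    low, high = EXPECTED_RANGE
    return [r for r in readings if low <= r <= high]

readings = [0.31, 0.29, 0.33, 2.75, 0.30]  # 2.75 is the gorilla
kept = clean(readings)
print(kept)  # [0.31, 0.29, 0.3]... the anomaly is gone,
             # and nothing downstream will ever know it was there
```

The fix isn’t technically hard; even logging a count of what gets discarded would surface the problem. But someone has to decide the discards are worth looking at.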
Is it possible we’re not just overlooking the gorillas, but hard-coding that blindness into our lives?
To quote Becher, “That’s one scary gorilla.”
I don’t know the answer, but I think it’s high time to recognize our blind spots and adjust accordingly.