“Has that always been there?” my husband asks, noting some new structure on a familiar route.
“I think so,” I say, “but honestly I don’t know for certain. It could’ve been there a long time.”
“It’s odd we didn’t notice it until now,” he remarks. Maybe so, maybe not. It is amazing what we can overlook when we’re distracted or focused elsewhere.
I’ve thought about this a great deal over the past few months as I have experienced some health challenges and wondered about the tests being prescribed, what they reveal, and what my doctors may or may not be missing. It’s not a frivolous exercise.
According to How Doctors Think (Groopman, 2007), our physicians are often pushed to see so many patients and consider so much complex data that it is not uncommon for them to engage in something known as “anchoring.” Anchoring refers to “a shortcut in thinking where a person doesn’t consider multiple possibilities but quickly and firmly latches on to a single one, sure that he has thrown his anchor down just where he needs to be” (p. 65).
My point here is not to critique modern medicine. If anything, I’m even more in awe of the number and complexity of decisions physicians have to make each day. Rather, Groopman’s perspective has challenged me to consider how we all have a tendency toward these same ways of thinking.
Charles Duhigg, the Pulitzer Prize-winning investigative reporter, makes an excellent case for this in Smarter Faster Better. He describes “cognitive tunneling” in his account of the crash of Air France Flight 447, en route from Brazil to France in 2009. You may recall it crashed into the ocean without warning about four and a half hours after takeoff. It took nearly two years to find the site of the wreckage, but the black box eventually revealed pilot error as the most likely cause of the crash, specifically, cognitive tunneling on the part of the pilots. Cognitive tunneling refers to “a mental glitch that sometimes occurs when our brains are forced to transition abruptly from relaxed automation to panicked attention” (p. 77).
Duhigg suggests that when we are in the grip of an emergency or crisis, we are most likely to focus on whatever is directly in front of us, ignoring other important information. Although all the information the pilots needed to save the plane was in front of them, they appeared to ignore it in favor of a single screen. Unfortunately, that one screen was insufficient and actually led them astray.
It troubles me to think I could be missing something important when I’ve been helping a client or trying to solve a difficult problem. And yet I know it’s possible. I wonder, “Did he mention those details earlier? How did I overlook that particular dynamic in an earlier session? Have I minimized the impact of her anxiety?”
What kept me from recognizing another perspective? Perhaps I was so focused on one aspect of the problem that I failed to notice important details.
There are many ways to jog ourselves out of cognitive ruts. Curious (Leslie, 2014) and the authors mentioned above offer some strategies. Given the number of complicated issues confronting us each day, at home, at work, and in our community, it might be helpful to recognize our tendency to “anchor” and “tunnel.” Life seems easier when we indulge in these shortcuts, but for complex problems, they are rarely sufficient.