The super-confident, doctor-as-god types do not always perform well. One study of radiologists, for example, found that those who performed worst on diagnostic tests were also the most confident in their diagnostic prowess.
People are loath to challenge experts. In a 2009 experiment carried out at Emory University, a group of adults was asked to make a decision while contemplating the claims of an expert, in this case a financial expert. A functional M.R.I. scanner gauged their brain activity as they did so. The results were extraordinary: when confronted with the expert, the independent decision-making parts of many subjects' brains all but switched off. They simply ceded their power to decide to the expert.
If we are to control our own destinies, we have to switch our brains back on and come to our medical consultations with plenty of research done, able to use the relevant jargon. If we can’t do this ourselves we need to identify someone in our social or family network who can do so on our behalf.
Anxiety, stress and fear — emotions that are part and parcel of serious illness — can distort our choices. Stress makes us prone to tunnel vision, less likely to take in the information we need. Anxiety makes us more risk-averse than we would ordinarily be, and more deferential.
We need to know how we are feeling. Mindfully acknowledging our feelings serves as an “emotional thermostat” that recalibrates our decision making. It’s not that we can’t be anxious, it’s that we need to acknowledge to ourselves that we are.
It is also crucial to ask probing questions not only of the experts but of ourselves. This is because we bring into our decision-making process flaws and errors of our own. All of us show bias when it comes to what information we take in. We typically focus on anything that agrees with the outcome we want.
We need to be aware of our natural-born optimism, for that, too, harms good decision making. The neuroscientist Tali Sharot conducted a study in which she asked volunteers what they believed the chances were of various unpleasant events occurring — events like being robbed or developing Parkinson's disease. She then told them what the real chances of such an event happening actually were. What she discovered was fascinating. When the volunteers were given information that was better than they had hoped or expected — say, that the risk of complications in surgery was only 10 percent when they had thought it was 30 percent — they adjusted their estimates closer to the new figures. But if the information was worse, they tended to ignore it.
This could explain why smokers often persist in smoking despite the overwhelming evidence that it's bad for them. If their unconscious belief is that they won't get lung cancer, then for every warning from an antismoking campaigner, their brain gives far more weight to the story of the 99-year-old lady who smokes 50 cigarettes a day but is still going strong.
We need to acknowledge our tendency to process challenging news poorly and actively push ourselves to hear the bad as well as the good. It felt great when I stumbled across information that implied I didn't need any serious treatment at all. When we find data that support our hopes, we appear to get a dopamine rush similar to the one we get if we eat chocolate, have sex or fall in love. But it is often the information that challenges our existing opinions or wishful desires that yields the greatest insights. I was lucky that my boyfriend alerted me to my most dopamine-drugged moments. The dangerous allure of the information we want to hear is something we need to be more vigilant about, in the medical consulting room and beyond.