One of the greatest risks facing consumers is our lack of imagination. We have real difficulty in imagining what calamities lurk around the next bend. Psychologists recognise this as a mental shortcut that humans use to make speedy evaluations. We’re wired not to waste time wondering whether we’re in mortal danger – we just run! This type of response may have served our ancestors well on the savannah, but such a lack of reflection is troubling in modern times.
We can often spend a great deal of time trying to determine whether what we already know for a “fact” is actually the case. But if the facts change, what else can we do but evolve with them? John Maynard Keynes, the macroeconomist, famously dismissed longstanding theory by declaring: “When the facts change, I change my mind. What do you do?” Mark Twain, among other things, is credited with saying, “It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.”
It is a really useful exercise to reconsider the beliefs that we hold dear – one or more of them could be wrong. Scientists and mathematicians may have it easy. Their truths are black or white; correct or incorrect. They regularly theorise about concepts but when their conviction is proved incorrect they have little difficulty reassessing their conjectures. Not so with the rest of us. We stubbornly entrench our thinking as if it is a significant weakness to be wrong – or to change our minds.
The human brain is well adapted to respond to some risk but it’s not skilled at changing its view. The memory pathways, once formed, are difficult to untangle. We know these difficulties as biases – once ensconced, biases are hard to shift. If initially we run at the first sign of danger, it is not easy to sit in contemplation the second time – even where the danger is a video of a venomous snake and not the reptile itself. The fear of being wrong strengthens our memory bias. We have a tendency to exaggerate the risk of shocking, infrequent events and underestimate the perils of ordinary, daily life. This can lead to bad decision-making, like cancelling a flight after hearing of a plane crash, when everyday activities such as driving carry far greater risk than travelling by plane.
Risk management theory should be more concerned with such emotional responses. We are, after all, dealing with frail humans. Failure to recognise risks and – the more common shortcoming – failure to admit that we got it wrong are human flaws. Mathematicians can help us analyse the probability of risk occurrences, but most eyes glaze over. Here’s an example from a multiple-choice test of the kind that is nowadays popular with examiners.
There are 4 questions with 4 possible answers to each. What are the chances of randomly guessing all four correct answers? 1/4 x 1/4 x 1/4 x 1/4 = 1/256. Or, more pertinently, a 255/256 risk of getting at least one wrong!
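For readers who prefer to check the arithmetic by machine, a minimal sketch in Python makes the calculation concrete (the function name and parameters here are my own, chosen to mirror the example above):

```python
from fractions import Fraction

def p_all_correct(questions: int, choices: int) -> Fraction:
    """Probability of guessing every question right by pure chance:
    (1/choices) multiplied by itself once per question."""
    return Fraction(1, choices) ** questions

p_right = p_all_correct(4, 4)
p_wrong = 1 - p_right  # the risk of getting at least one answer wrong

print(p_right)  # 1/256
print(p_wrong)  # 255/256
```

Using exact fractions rather than floating-point numbers keeps the result in the same 1/256 form quoted above, with no rounding to obscure it.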