Psychologist Daniel Kahneman is considered by many in my profession to be the father of behavioral economics. In 2002 he was awarded the Nobel Prize in Economic Sciences “for having integrated insights from psychological research into economic science, especially concerning human judgment and decision-making under uncertainty.”
His bestselling book Thinking, Fast and Slow summarizes decades of research, and challenges the assumption of human rationality still prevalent today in many economic theories.
And yet in many ways he is still part of the same rationalistic tradition that believes that when solving complex problems, a detached, unbiased, calculated approach to decision making always leads to the best outcomes.
Intuition and mental shortcuts, called heuristics, are the quick and easy way we make decisions in what Kahneman calls System 1 thinking. This is the place where we make decisions by the seat of our pants; emotionally; using our gut. System 2, on the other hand, is where reasoning happens; where we make rational decisions; where we slow down and take a calculated approach.
The problem, according to Kahneman, happens when we try to use System 1 to solve a System 2 problem.
Take the problems “2 + 2” or “25 x 10.” We can easily solve these without ever reaching for a calculator. They’re simple System 1 problems.
Now try “164 divided by π, times the square root of 56.” Here we know enough to use System 2, and so we do reach for the calculator.
But watch what happens when we mistake a complex System 2 problem for a simple System 1 one:
Kahneman’s famous example is this:
“A bat and a ball together cost $1.10. The bat costs $1 more than the ball.”
How much does the ball cost?
Most of us instinctively say “10 cents,” because a dollar-ten separates quite naturally into $1 and 10 cents, and 10 cents seems about right. But it’s not. The right answer, 5 cents, takes more calculation: a 5-cent ball plus a $1.05 bat comes to $1.10, and the bat costs exactly $1 more than the ball. The question is more complex than it first appears.
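For readers who want the System 2 calculation spelled out, here is a minimal sketch of the algebra. The framing and the variable x are mine, not Kahneman’s:

```latex
% Let x be the price of the ball in dollars; the bat then costs x + 1.00.
\begin{align*}
  x + (x + 1.00) &= 1.10 \\
  2x + 1.00      &= 1.10 \\
  2x             &= 0.10 \\
  x              &= 0.05
\end{align*}
% The ball costs 5 cents and the bat $1.05: together they total $1.10,
% and the bat costs exactly $1.00 more than the ball.
```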
Ultimately, the lesson here is clear. When it comes to complex decision-making: listen to your gut, and you will almost always be wrong.
Gerd Gigerenzer thinks this is the wrong lesson. Gigerenzer authored Gut Feelings: The Intelligence of the Unconscious and Rationality for Mortals: How People Cope with Uncertainty. He is director emeritus of the Center for Adaptive Behavior and Cognition at the Max Planck Institute for Human Development. And he’s long been critical of Kahneman’s work.
For example, before applying Kahneman’s System 1 or System 2, Gigerenzer wants us first to ask a more fundamental, ontological question:
“Does the problem you’re trying to solve actually have a solution?”
Because according to Gigerenzer, under conditions of uncertainty, complex problems are better tackled by professionals using trained instinct and expert intuition than by an unbiased, detached, rational, calculated approach.
The outcomes are simply better.
Answerable problems may well be the province of logic, reason, and calculation; the domain of knowledge and information. They fit squarely in the wheelhouse of computers, big data, and artificial intelligence. But when it comes to questions that are unanswerable…
The problem with problems is that our clients don’t merely have problems – they have lives!
Lives filled with all the contingency, and uncertainty, and unknown unknowns that are just part of being a human being in the 21st century.
The vast majority of our decisions are not likely to ever become deliberate acts of will. But that doesn’t mean they have to be unreflective acts of irrationality. The good news is it seems we humans have a unique capacity to deal with life’s uncertainty; its unpredictability; even its unknown unknowns.
Besides, if it’s the gut that betrays us, why bother looking elsewhere? Why not retrain the gut?
Because whether we’re talking about future stock market returns or the future of climate change and the human race, there are no right answers, only better results.