The current financial crisis offers plenty of lessons to be learned. A column today by David Brooks in the New York Times is a good example. He writes about the behavioral assumptions upon which economic and social science disciplines are based.
Roughly speaking, there are four steps to every decision. First, you perceive a situation. Then you think of possible courses of action. Then you calculate which course is in your best interest. Then you take the action.
Over the past few centuries, public policy analysts have assumed that step three is the most important. Economic models and entire social science disciplines are premised on the assumption that people are mostly engaged in rationally calculating and maximizing their self-interest.
But during this financial crisis, that way of thinking has failed spectacularly. …
… perhaps this will be the moment when we alter our view of decision-making. Perhaps this will be the moment when we shift our focus from step three, rational calculation, to step one, perception.
I find a striking parallel in my personal and corporate (church) experience. In recent years it has become more apparent to me that I, likewise, have religiously (pun intended) focused on step three as the most critical step in the decision-making process, with little emphasis on step one. I assumed my vision and understanding were unbiased, which in turn produced faulty assumptions that subverted steps two and three. Brooks says it this way:
Perceiving a situation seems, at first glimpse, like a remarkably simple operation. You just look and see what’s around. But the operation that seems most simple is actually the most complex; it’s just that most of the action takes place below the level of awareness. Looking at and perceiving the world is an active process of meaning-making that shapes and biases the rest of the decision-making chain.
Brooks cites Nassim Nicholas Taleb’s description of the perceptual biases that distort our thinking:
- our tendency to see data that confirm our prejudices more vividly than data that contradict them;
- our tendency to overvalue recent events when anticipating future possibilities;
- our tendency to spin concurring facts into a single causal narrative;
- our tendency to applaud our own supposed skill in circumstances when we’ve actually benefited from dumb luck.
… looking at the financial crisis, it is easy to see dozens of errors of perception. Traders misperceived the possibility of rare events. They got caught in social contagions and reinforced each other’s risk assessments. They failed to perceive how tightly linked global networks can transform small events into big disasters.

He [Taleb] subscribes to what he calls the tragic vision of humankind, which “believes in the existence of inherent limitations and flaws in the way we think and act and requires an acknowledgement of this fact as a basis for any individual and collective action.” If recent events don’t underline this worldview, nothing will.

This meltdown is not just a financial event, but also a cultural one. It’s a big, whopping reminder that the human mind is continually trying to perceive things that aren’t true, and not perceiving them takes enormous effort.