You know how kids get a crush on someone, how for a time, their every thought and feeling is enlivened by the uncanny existence of the Object of Desire? Reluctantly, I admit this describes my intellectual life. I live for those moments when I discover a new mind, one that illuminates a facet of the world I have not previously been able to bring into focus. I read, I listen, I obsess over my new crush’s thoughts. I have the feeling that integrating them will change my mind, the way a new couch or table refocuses an entire room. I crave that change, and at the same time, I don’t want my new crush to displace the beloved old crushes lounging in the cozy armchairs of my awareness.
Meet Nassim Taleb, a thinker who just might clash with the furniture. Since I discovered him a few days ago, I’ve been downloading whatever I can find, ordering books and listening to podcasts. Go to EconTalk for the best podcast, Taleb’s Web site for links to most everything, and Wikipedia for an overview.
Here are all the caveats: this is a supremely confident (I forbear to say arrogant because I like his mind so much), elite personage whose interest in financial markets I don’t really share, whose politics appear to be of the Olympian-libertarian variety and whose grasp of the language of mathematics so far exceeds mine that it might as well be Greek (or Lebanese, as he is by birth). But I promise you, the underlying ideas are well worth the journey.
What Taleb has already given me are much better reasons than my own instincts to do two things I’ve been advocating loud and long: distrust predictions and question theories. One of his main rhetorical devices is to imagine two realms.
Mediocristan is “the province dominated by the mediocre, with few extreme successes or failures. No single observation can meaningfully affect the aggregate. The bell curve is grounded in Mediocristan.” For example, concrete physical realities with limited ranges, like human height or weight, express Mediocristan. If you sample a thousand human beings, plotting their heights, you’ll get a result that looks very much like a bell curve, with most people clustered within a close range, and few that are markedly shorter or taller than that range. Within that sample, the presence of a few little people or basketball players won’t significantly affect the average, the median or the whole.
Extremistan is “the province where the total can be conceivably impacted by a single observation.” For example, there is no intrinsic limitation on income. Randomly choose a thousand humans, from the poorest nation to the richest: if Bill Gates is included in the mix, he will throw the average wildly off and account for nearly the entire total (the median, interestingly, barely budges). His presence creates a complexity very different from the regularity of Mediocristan.
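If you like seeing this sort of thing in numbers, here is a toy Python sketch of my own (the figures are illustrative assumptions, not Taleb’s data): a thousand bell-curved heights beside a thousand net worths that include one Bill Gates-scale fortune. The tallest person contributes a rounding error to the total height; the single fortune is nearly the whole of the wealth total, and it drags the average far away from the median.

```python
import random
import statistics

random.seed(1)

# Mediocristan: human heights in cm, roughly bell-curved within a narrow range.
heights = [random.gauss(170, 10) for _ in range(1000)]

# Extremistan: net worth in dollars. Illustrative log-normal wealth for 999 people,
# plus one Bill Gates-scale outlier (~$50 billion, a rough 2007-era figure).
wealth = [random.lognormvariate(10, 1) for _ in range(999)]
wealth.append(50e9)

def summarize(label, xs):
    total = sum(xs)
    print(f"{label}: mean={statistics.mean(xs):,.1f}  "
          f"median={statistics.median(xs):,.1f}  "
          f"largest single observation's share of total="
          f"{max(xs) / total:.2%}")

summarize("Heights (Mediocristan)", heights)  # the tallest person holds ~0.1% of the total
summarize("Wealth  (Extremistan)", wealth)    # the one outlier holds nearly all of the total
```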
Taleb argues convincingly that we treat far too much of our reality as if it were Mediocristan when in fact much of it often behaves like Extremistan, where there are occasional “black swans” (his name for the unexpected event and the title of his most recent book) among the white. So, for example, out of the many thousands of books, films and recordings released each year, a small number will account for the largest part of sales, and it is not possible to predict with certainty which of the many works released will find black swan-style success (or failure). Indeed, in any endeavor susceptible to notable, unpredictable exceptions, no amount of examining the past will enable us to foretell the future.
What’s going on here? Taleb discusses many factors contributing to our tendency to see our world as Mediocristan. There is the fact that our brains evolved long ago to deal with a world with many fewer variables, much less organized information, and a vastly smaller number of theories to explain it. The more complex a given situation, the larger the number of examples you need to understand what is happening in it. For instance, sampling the sales of a few dozen published books each year won’t tell you much about the prospects of the thousands of others not sampled. Your sample might or might not happen to catch one of the black swans (an unexpectedly huge winner or loser), so anything you concluded from it would not generalize to the rest.
Yet it is plain to see that we have a powerful (one might say inbuilt) desire to figure things out, and we like it best when they fall into stable, understandable and predictable patterns. So over and over, we surrender to that desire, generalizing on the basis of too little information and coming up wrong.
If I place a few winning bets, I might conclude I am a skilled gambler, even though all that actually happened was a short run of luck. So many Hollywood careers have that trajectory: an early black swan of success is chased by financiers wanting to invest in the next blockbuster, only the second film is a plain old white swan and the one after that an ugly duckling, and all the investors are puzzled as to how they could have been mistaken…until the next black swan comes along.
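Taleb’s point about luck masquerading as skill is easy to check with a quick simulation of my own (pure coin flips, no skill anywhere in the setup): give ten thousand people ten fair bets each, and by chance alone roughly a hundred of them will post a record that looks like talent.

```python
import random

random.seed(7)

GAMBLERS = 10_000   # people placing bets
BETS = 10           # fair 50/50 bets each; no skill involved anywhere

# Count how many "skilled-looking" gamblers emerge from pure chance.
streaks = 0
for _ in range(GAMBLERS):
    wins = sum(random.random() < 0.5 for _ in range(BETS))
    if wins >= 9:   # 9 or 10 wins out of 10 looks like talent
        streaks += 1

# Expected fraction: (C(10,9) + C(10,10)) * 0.5**10 = 11/1024, about 1.1%
print(f"{streaks} of {GAMBLERS} coin-flippers won at least 9 of 10 bets by luck alone")
```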
I love what Taleb has to say about inventions, how almost all of the discoveries that have had tremendous impact on our culture were accidents in the sense that they were made while searching for something else. Because of hindsight bias, he says, histories of economic life and scientific discovery are written with straightforward story lines: someone set out to do something and succeeded, and it is all about intention and design. But in truth, “most of what people were looking for, they did not find. Most of what they found they were not looking for.” Penicillin was just some mold that happened to inhibit the bacteria growing in a lab culture; lasers at first had no obvious application beyond a possible form of radar; the Internet was conceived as a military network; and despite massive National Cancer Institute-funded research, the most potent cancer treatment, chemotherapy, was discovered as a side-effect of mustard gas in warfare (people exposed to it had very low white blood cell counts). Or look at one of today’s biggest medical moneymakers: Viagra was devised to treat heart disease and high blood pressure.
It’s interesting to think about this in career or relationship terms, realms full of complex human variables. Taleb points out that tons of books and gurus are based on asking the successful to explain how they got there. Typically, big winners in both business and love say it took good ideas and lots of hard work. But these are just stories people generate out of the need to explain, because many big losers also had good ideas and worked their butts off, with the opposite result.
This is so commonsensical that it ought to be obvious, but as Taleb says, we suffer so badly from the “confirmation error” (looking for information to confirm a foregone conclusion or belief system) that we are thrilled to the point of stupidity when someone publishes a book or otherwise propounds an idea that confirms our hunches.
Guilty as charged. There’s no denying that Taleb confirms some of my pet observations. For example, isn’t the following entry from his glossary delicious?
Empty-suit problem (or “expert problem”): Some professionals have no differential abilities from the rest of the population, but for some reason, and against their empirical records, are believed to be experts: clinical psychologists, academic economists, risk “experts,” statisticians, political analysts, financial “experts,” military analysts, CEOs, et cetera. They dress up their expertise in beautiful language, jargon, mathematics, and often wear expensive suits.
Many such experts have made their reputations by giving retrospective explanations for events, often delivered in the type of neat theoretical package that satisfies the desire for a story confirming our beliefs or reinforcing our sense of security. But being able to make up a good story after the fact is meaningless; the only thing that can count as true understanding, that can truly test a theory, is accurate prediction, and there we have fallen far short of success.
Mostly, I think, our theorizing is useless at best, dangerous at worst. How many times have you had the following experience? You become aware of two different physical symptoms at the same time, say a headache and a rash. In the privacy of your own head, you develop a hypothesis that links them: the same naughty bacterium is nibbling at your brain and your epidermis. Then you go to the doctor and are surprised (and relieved) to learn the rash is poison oak and the headache too much wine.
Just so in the news. Because we love things to make sense, because we are always on the lookout for correlations (and willing to settle for cleverly packaged coincidences in their place), any study that seems to satisfy these desires gets column-inches, but there’s no room to print the fact that a dozen other teams of researchers looked at the same phenomena without turning up a meaningful correlation. For every scientific experiment or medical study that produces a startling (if short-lived) conclusion that feeds our desire for orderly sense, there are countless studies that generate inconclusive or negative results.
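A toy simulation of my own (not from Taleb, and deliberately crude) shows why this is inevitable: run a hundred studies of an effect that does not exist, and a handful of them will clear the conventional significance bar anyway, and those are the ones that make the news.

```python
import random
import statistics

random.seed(42)

STUDIES = 100   # independent research teams
N = 30          # subjects per group in each study

def fake_study():
    """Two groups drawn from the SAME distribution: there is no real effect."""
    a = [random.gauss(0, 1) for _ in range(N)]
    b = [random.gauss(0, 1) for _ in range(N)]
    # Crude two-sample z-style statistic; close enough for illustration.
    se = (statistics.variance(a) / N + statistics.variance(b) / N) ** 0.5
    z = (statistics.mean(a) - statistics.mean(b)) / se
    return abs(z) > 1.96   # roughly p < 0.05, two-sided

positives = sum(fake_study() for _ in range(STUDIES))
print(f"{positives} of {STUDIES} no-effect studies produced a 'significant' result")
```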
Lately, I have been thinking about a problem that Taleb alludes to in even the bit of his thinking I’ve already read and heard. We are surrounded by a gargantuan news- and information-generating apparatus. Its appetite is enormous, as it must poop out vast quantities of airtime and newsprint every day. Consequently, we have story after story about results that turn out to be irreproducible (every week bringing its news flash of soon-to-be-supplanted miracle cures and diets, for instance) and a glut of theories as to why the economy works (or doesn’t), how to reduce crime, how to improve education, and so on. We broadcast a whole flock of black swans every evening (uncanny accidents, rare occurrences, terrible risks gone wrong), which normalizes them in our minds, so that our estimation of their likelihood is amazingly skewed. From what I can see, this glut is making us less and less able to cope.
Taleb’s sense of our problem is that we do not know how much we don’t know. He has a challenging task in drawing useful advice out of uncertainty. I’m looking forward to reading his books, but in the meantime, I am entertaining a few ideas his work seems to suggest:
Since we can’t control unpredictable events, we should accept uncertainty and seek to maximize our exposure to serendipity, as by putting ourselves in the way of new ideas.
Since there is such danger in accepting conclusions based on too little information simply because they confirm our beliefs, we should try to remain aware in the present of what we are doing, paying attention to what actually happens and refraining as far as possible from imposing theories on our experience.
We should recognize our poor record as a species in predicting the future, that we are much better at doing than knowing. Some things are more predictable than others: we are safe enough in expecting tomorrow’s sunrise to plan on breakfast. We can start noticing which situations are most susceptible to black swans, and when we encounter them, remember how little we truly know so our ignorance doesn’t lead us around by the nose.
I hear my old crushes—Paul Goodman, Paulo Freire, Isaiah Berlin—grumbling a little at having to move their armchairs back to make room for this upstart. But really, he fits right in. Goodman wrote eloquently about the slavishness expressed in our devotion to experts; Berlin rejected most theorizing about human beings as “grotesque”; and Freire made us understand the disabling effects of allowing certain ideas about ourselves to dominate our minds. In truth, they seem very happy to meet Nassim Taleb, another uncolonized mind. And so am I.