Thursday, March 19, 2009

The Black Swan

For the past several days I've been absorbed in the non-fiction book The Black Swan, by Nassim Nicholas Taleb. This book, which had been recommended to me by at least half a dozen people, was a New York Times bestseller when it debuted two years ago. Taleb is a Wharton School graduate who made a lot of money in the stock market and who is a devotee of world history.

The book, which is subtitled "The Impact of the Highly Improbable," has this as its premise: the bullet that hits you is never the one you see coming. All right, that's a simplification. What Taleb does is categorize all the reasons we misjudge probabilities, plus the huge cost of those misjudgments when an unexpected event (a "black swan") suddenly appears. Taleb believes that it's the unexpected that shapes history, the financial markets, and many of our everyday disasters. What you don't see gets left out of your predictions, and what you don't see matters hugely.

Two examples, the first borrowed from Bertrand Russell: a turkey gets fed every day by humans for a thousand days. The turkey is "justified by the data" in concluding that humans reliably feed turkeys. The thousand-and-first day is the day before Thanksgiving, and the turkey gets the axe. This, Taleb argues, is what happens to financial markets, and thus why all the "trend data" in the world results in professionals doing no better than amateurs at picking stocks (a phenomenon confirmed by experiments).

A second example: Humans invest too heavily in post-event explanations of causality (which may or may not be true; Taleb isn't a big believer in rational causality on the grounds that most humans aren't very rational). We then apply this post-event analysis to the future, and so again don't see coming the totally unexpected random event. As a result, Taleb distrusts nearly all "experts" -- they know too much past data, and so are blinded by their own expectations of order rather than seeing the true randomness of life.

I am only halfway through the book, so I haven't yet come to Taleb's suggestions about all this (warning: MATH AHEAD). But I am very intrigued so far, partly because my own life often looks so affected by randomness (for instance, I never planned on becoming a writer). Has anybody else read The Black Swan? Thoughts?


TheOFloinn said...

I've read his Fooled by Randomness, which covers the same ground. It became repetitive after a while, and I decided that Taleb is a snot. He is correct, mind you; but he's snotty about it. He also misuses the term "random."

Nothing he said was unfamiliar to a practicing quality engineer. In fact, I might go further than he does, since we distinguish between tail events and special-cause events due to disturbances to the system. That is, when you throw the dice and they come up 12, that may be a rare event, but it was predictable that it would happen sooner or later. When it will happen is up in the air, but that it will happen is not. A 13, however, is inherently unpredictable and depends on nonrandom causes.

About that misuse of "random": random variation is predictable, in that the overall pattern of results is predictable even when each individual result is not. That's how casinos and insurance companies make money.

Congress could pass a law making 12's double-plus ungood and punishing those who throw them, but a 12 will still pop up about once in 36 throws. Only the record gets falsified. You cannot treat the tail of a distribution as if it were a special cause. To get rid of the 12s you need to redesign the dice. But be careful: if the system is at all complex, making random changes to it will usually make things worse.
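The dice distinction is easy to see in a few lines of Python (my own sketch, not from either book): a 12 is a rare but predictable tail event, while a 13 would signal a special cause from outside the designed system.

```python
import random

random.seed(1)  # fixed seed so the sketch is reproducible

# Throw a pair of fair dice many times.
throws = [random.randint(1, 6) + random.randint(1, 6) for _ in range(36_000)]

# A 12 is a tail event: rare on any one throw, but its long-run
# frequency is predictable -- about 1 in 36 throws.
count_12 = throws.count(12)
print(count_12 / len(throws))  # close to 1/36, about 0.028

# A 13 never appears. If one did, it would be a special cause:
# something outside the system as designed (different dice).
print(throws.count(13))  # 0
```

When a 12 happens is up in the air; that it happens, and how often in the long run, is not.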

Taleb warned about the mortgage bubble. So did Jane Jacobs and the Wall Street Journal. The latter had been doing so for years. When I was consulting with a player in that market I was astonished to learn that they had never heard of Failure Modes and Effects Analysis, something in common use in aerospace since the 1950s.

Jake Freivald said...

Like Mike, I read Fooled by Randomness. I'm also an engineer by training, although I'm not one anymore and was never a good one.

I thought the book was a good introduction for the layman to lots of things that would normally be out of one's grasp. It's also a good reminder, even for the mathematically inclined, about lots of true but not-perfectly-intuitive things.

It's the kind of thing that should be read alongside Karl Popper's Conjectures and Refutations to infuse a little humility into our thought processes.

Overall, I'd recommend it.

bluesman miike Lindner said...

I haven't read the book (which is still a big seller where I work, B&N Lincoln Triangle), so perhaps I shouldn't comment. But I've got a big yapping trap, so let it flap.

How can anyone criticize decision-making unless the decider has all the information needed to make a "rational" decision? Clearly, no one does and no one ever will. The Universe is too mighty, our minds too limited, and our lives too short. We look at what's happened, and extrapolate. And hope what we =want= to happen will come to pass. And a lot of the time, it does. Witness our technical civilization, which published scholar Taleb's book.

Sorry if my ignorance does discredit to the Enlightenment Civilization...

Which, in any case, died at the Battle of the Somme in 1916. :-(

CryptoFrenetic said...

There were a couple of things in the book that really sang to me. The first was the notion that most of our modeling (of just about anything you can imagine) is based on the assumption that everything falls within a "normal," Gaussian, or bell-shaped distribution. But the reality is that most things in life do not follow that type of randomness. Instead, life tends to follow more of a power rule, where the result you get today has an effect on the result you get tomorrow. That's how "bubbles" are formed.

If you understand this, then you understand why the models continually fail. But the amazing thing is that knowing the models don't work, the "experts" continue to use them as though they do!

Certainly, casinos can use the normal distribution to their advantage because the type of randomness in the gambling system is designed to be independent. If I roll the dice and they come up 12, then that will not affect my chances of rolling a 12 again on the next roll. The two rolls are independent.

But what about life insurance companies? If my neighbor dies, does it change the odds that I will die? If he died in a car accident, probably not. But if he died of a communicable disease, then he might have increased my risk. If he infects 5 people, and those 5 another 5 and so on, then the risk (to the insurance company) no longer looks like a normal/random distribution. In fact, whenever the next major pandemic hits (and history tells us we're overdue), the life insurance industry will go belly-up pretty quickly.
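The independence point can be made concrete with a toy simulation (entirely made-up numbers, not from the book): independent deaths stay in a narrow band year after year, while even a rare contagion occasionally produces a catastrophic year.

```python
import random

random.seed(7)

N = 10_000       # policyholders
P_DEATH = 0.001  # independent annual death probability

def independent_year():
    # Each policyholder's death is an independent coin flip.
    return sum(random.random() < P_DEATH for _ in range(N))

def contagion_year(spark=0.02, r0=5, generations=6, fatality=0.1):
    # Hypothetical outbreak model: with small probability an epidemic
    # starts, and each case infects r0 others for several generations.
    # Deaths are no longer independent, so the tail is far fatter.
    deaths = independent_year()
    if random.random() < spark:
        cases = 1
        for _ in range(generations):
            cases = min(cases * r0, N)
        deaths += int(cases * fatality)
    return deaths

indep = [independent_year() for _ in range(1_000)]
contag = [contagion_year() for _ in range(1_000)]
print(max(indep))   # worst year stays near the expected 10 deaths
print(max(contag))  # worst year explodes far beyond that
```

The averages of the two models look almost identical; it's the worst case, the number the insurer actually has to survive, that differs by orders of magnitude.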

The only way to build an appropriate model is to understand the underlying function and for most things in life, we don't have a clue what that might look like. Assuming an incorrect model is "good enough" is worse than no model at all, because it lulls you into believing that nothing exists outside the model even when examples to the contrary abound, and those examples have a profound impact.

TheOFloinn said...

Normal is okay if the measured result is the sum of many independent variables, no one of which is dominant. It will also be the case when you deal with the averages of sufficiently large samples -- and they don't have to be too large. If you're dealing with the product of many variables, you get a lognormal; with a polynomial combination of variables, an extreme value distribution. What Taleb calls a "power rule" is actually serial correlation. The climate is a good example.
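The sum-versus-product distinction is easy to check numerically (a sketch with arbitrary factors, not any particular real process): sums of independent terms come out roughly normal, while products come out lognormal, i.e. right-skewed until you take logs.

```python
import math
import random
import statistics

random.seed(3)

def skew(xs):
    # Standardized third moment: near 0 for symmetric, normal-ish data.
    m, s = statistics.mean(xs), statistics.stdev(xs)
    return sum(((x - m) / s) ** 3 for x in xs) / len(xs)

# Sum of many independent variables -> approximately normal (CLT).
sums = [sum(random.random() for _ in range(50)) for _ in range(20_000)]

# Product of many independent positive factors -> lognormal:
# the log of a product is a sum, so the *log* is approximately normal.
products = [math.prod(0.9 + 0.2 * random.random() for _ in range(50))
            for _ in range(20_000)]
logs = [math.log(p) for p in products]

print(round(skew(sums), 2))      # near 0: symmetric
print(round(skew(products), 2))  # clearly positive: right-skewed
print(round(skew(logs), 2))      # near 0 again after taking logs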

The real problem, imho, is the tails. Even a reasonably normal case will not be well-modeled in the tails by the Gaussian, because the Gaussian runs to infinity in both directions and no real-world process does. That means the probabilities of extreme values will most often be wrong, and the extremes are where most of our problems arise. (Think: "hundred year flood")
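The tail warning can be demonstrated with a quick sketch (using a Student-t with 3 degrees of freedom as a stand-in for a fat-tailed process; the choice is mine, purely for illustration): fit a Gaussian to fat-tailed data and it badly underestimates how often 4-sigma events occur.

```python
import math
import random

random.seed(5)

def t3():
    # Student-t with 3 degrees of freedom: a standard normal divided by
    # sqrt(chi-square/3). Heavy-tailed, but mean and variance exist.
    z = random.gauss(0, 1)
    chi2 = sum(random.gauss(0, 1) ** 2 for _ in range(3))
    return z / math.sqrt(chi2 / 3)

sample = [t3() for _ in range(100_000)]
mean = sum(sample) / len(sample)
sd = math.sqrt(sum((x - mean) ** 2 for x in sample) / len(sample))

# How often does the data actually stray beyond 4 standard deviations?
k = 4
empirical = sum(abs(x - mean) > k * sd for x in sample) / len(sample)

# What a fitted Gaussian predicts for the same threshold
# (two-sided tail probability).
gaussian = math.erfc(k / math.sqrt(2))

print(empirical, gaussian)  # the empirical tail is far larger
```

Mean and standard deviation match the fitted model almost perfectly; it's only out in the tails, where the hundred-year floods live, that the Gaussian is off by orders of magnitude.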

Put not thy faith in models, at least not for extreme events. And you have to know all the important Xs, too.

cd said...

I'm hoping it means one of the Nigerian emails is for real, and if I send my bank info to all of them one will be a black swan and I'll get that princely treasure!

James A. Ritchie said...

While totally unexpected things certainly happen, it seems to me most things that get us in trouble on a societal level are only unexpected to the experts.

Black Swans are now flying everywhere, and have become an excuse for failure in every field there is. Throwing dice may tell you the odds of the big picture, but it's the pixels that make up the picture we should be concerned with.

I have no doubt Black Swans exist, but I also have no doubt that true Black Swans are as rare as hens' teeth. Math is a wonderful tool, and can be highly useful in looking at some things, but math sucks when it comes to individual people.

Unknown said...

Yes, I read Black Swan but not the previous book, so it was new ground for me. I was impressed by his ideas about how technology changes artistic and intellectual enterprises. Technology, he claims, allows more and more people to see fewer and fewer artists more and more often. Thus the rewards for art and intellect are increasingly narrowed down even as the profits for those fewer people rise.

He also believes that the most useful people in the world are those who prevent disasters, but a non-disaster isn't news, so we don't pay much attention to them.

He also explains that while lots of people on Wall Street knew the bubble would burst someday, no one could move out. If you cashed out to be safe while the bubble was growing, your stockholders would complain that they weren't making as much money as those whose traders stayed in the bubble. They might even leave your mutual fund for one still in the bubble to keep making money. So the traders felt stuck by their backers, and the backers now blame the traders for not getting out.