Published by The Random House Publishing Group [US] and Allen Lane [UK] (2007); Penguin Books (2008)
Nassim Nicholas Taleb doesn’t shy away from voicing his opinions, even when they lead to controversy. One example is his recent spat with historian Mary Beard over a BBC ‘educational’ cartoon that depicted a sub-Saharan man as representative of diversity in Roman Britain. Beard claimed the BBC’s portrayal was “accurate” – Taleb begged to differ. He is a no-nonsense sceptical empiricist; a mathematical trader, essayist and philosopher who is willing to go against supposedly reputable narratives and challenge preconceptions. He also names and shames those responsible for propagating bad ideas. He has much to say about how we view the world, and his perspective is one that can change your outlook.
In the 1980s Taleb worked in quantitative finance, but as a trader and ‘quant’ he did the opposite of what most quants do: instead of accepting the standard mathematical models applied to uncertainty, he looked for their flaws and limits, seeking to take advantage of rare events that others said couldn’t happen. On Black Monday, 19 October 1987, when stock markets around the world crashed, his approach was vindicated. Other traders and quants, so confident in the impeccable credentials of their mathematical models, were flummoxed when the crash came: they hadn’t anticipated such an occurrence. Taleb had – or at least he had foreseen the possibility of such a rare event, if not the specific crash itself.
“I was convinced that I was totally incompetent in predicting market prices – but that others were generally incompetent also but did not know it, or did not know that they were taking massive risks.”
That brings us to the focus of the book. Taleb takes the view that these rare events, or ‘Black Swans’ – in life generally, not just in financial matters – are unpredictable; we should adapt to them rather than attempting to predict them, or fooling ourselves by following models that assume they won’t happen just because it is neat and comforting to believe so.
The book opens by explaining what ‘Black Swans’ symbolise. Before Australia was discovered, people in the Old World held the unshakeable opinion that only white swans existed, because they were all that had ever been observed. The later discovery of black swans forced people to change their views. Taleb states:
“It illustrates a severe limitation to our learning from observations or experience and the fragility of our knowledge. One single observation can invalidate a general statement derived from millennia of confirmatory sightings of millions of white swans.”
Random events might be difficult, or impossible, to directly predict; they can cause major turmoil (financial crashes, natural disasters) or can bring spectacular successes (a ‘surprise’ bestselling novel, soaring stock valuations). Taleb doesn’t claim that we can know the specifics of these events in advance; his point is that we should be aware they can happen, so as to be prepared to take advantage of the good ones and to minimise loss from the bad.
The first part of the book looks at how we seek validation. Taleb believes the human mind suffers from three ailments, or a ‘triplet of opacity’, when it comes into contact with history: a) the “illusion of understanding”, where people think they know what is going on in a world that is more random than they realise; b) retrospective distortion, where matters are assessed only after the fact; c) overvaluation of factual information, where ‘learned’ people ‘Platonify’, i.e. seek to categorise, simplify and make everything neat and well-defined. Categorising can be important, “but it also becomes pathological when the category is seen as definitive, preventing people from considering the fuzziness of boundaries, let alone revising categories”, and the process reduces true complexity.
Not only do Black Swan events lie outside of our expectations and carry extreme impacts but also, in spite of being outliers, human nature seeks to concoct explanations for them after the fact. Taleb warns against the “narrative fallacy” which many historians are guilty of, i.e. applying a narrative to events and assuming that they were inevitable, that they had to happen the way they did.
What is behind this defective reasoning? Taleb notes the research conducted by the likes of Daniel Kahneman and Amos Tversky into how the human brain processes information. Kahneman won a Nobel Prize for his ideas in 2002, and his book Thinking, Fast and Slow, published in 2011, goes into them in more depth (it is a work similar to Taleb’s in that it challenges our traditional view of the world). As Taleb summarises it, the brain processes information using two systems: System 1, which is quick and automatic (we might also call it intuitive); and System 2, which is slower and logical but more effortful. It is natural for us to rely heavily upon System 1 because it is easier and requires less exertion; it may hark back to days when humans had to make snap decisions about fleeing from wild animals, so it is risk averse and emotional too. We become so used to it that we often don’t engage System 2 at all. There can be good reasons for this: as Kahneman notes in his book, System 2 uses more energy, so it cannot be used constantly, and it is of little use for making quick decisions. Fast thinking is easier, sometimes necessary, and serves us well in many day-to-day tasks where slower thought would be wasteful. But, as Taleb notes, the danger is that “we are not introspective enough to realise that we understand what is going on a little less than warranted from a dispassionate observation of our experiences”.
The narrative fallacy causes us to seek quick and easy explanations that often don’t reflect the reality of what occurred. It is part of a human tendency to seek explanations and identifiable causes, grabbing the most apparent ones, because doing so makes it easier for us to cope with what has happened. Taleb doesn’t deny that causes exist; he simply notes that one should “be suspicious of the ‘because’ and handle it with care”, especially if you suspect silent evidence might be involved. He illustrates this using a story told by Cicero:
“One Diagoras, a nonbeliever in the gods, was shown painted tablets bearing the portraits of some worshippers who prayed, then survived a subsequent shipwreck. The implication was that praying protects you from drowning. Diagoras asked, ‘Where are the pictures of those who prayed, then drowned?’”
This distortion is in essence a bias. Taleb explores the idea further using Giacomo Casanova. Casanova lived a seemingly lucky life: whenever he was in difficulty he always managed to find a new patron, or a former lover who took pity on him. Some people would consider him lucky (he seemed to as well), but as Taleb notes, cemeteries are full of Casanovas, i.e. all those who lived similarly but were ultimately unsuccessful. His survival wasn’t down to luck so much as the fact that he happened to scrape through where many others in similar positions did not; by the law of averages, someone presumably had to. It is the survivors who are most likely to think themselves invincible, and of course they are the ones who get to tell their stories and be remembered.
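The arithmetic of silent evidence can be seen in a toy simulation (my own illustration, not Taleb’s; the survival odds are assumed purely for the sketch): give a large population of would-be Casanovas a risky episode every year, then look only at those still standing after forty years.

```python
import random

random.seed(42)

# Hypothetical numbers, chosen only to illustrate silent evidence:
# each adventurer faces a risky episode every year for 40 years.
ADVENTURERS = 100_000
YEARS = 40
SURVIVAL_PER_YEAR = 0.8   # assumed 80% chance of scraping through each year

survivors = 0
for _ in range(ADVENTURERS):
    if all(random.random() < SURVIVAL_PER_YEAR for _ in range(YEARS)):
        survivors += 1

# Pure chance guarantees a handful of 'lucky' lives: roughly
# 100_000 * 0.8**40, i.e. about 13 survivors, each of whom may
# credit skill or destiny -- while the cemeteries hold the rest.
print(f"{survivors} of {ADVENTURERS} survived all {YEARS} years")
```

The dozen or so survivors have nothing special about them; the process guaranteed that a few people would make it, and they are the only ones left to write memoirs.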
The second part of the book moves on to examine the human failure to predict. Taleb discusses the 1965 ‘discovery’ by two radio astronomers mounting a large antenna at Bell Labs in New Jersey, one that had an important impact on big bang theory. A background hiss could not be removed, even when they cleaned bird excrement from the dish, thinking that was the problem. It took the scientists time to work out that the sound was the cosmic microwave background radiation that traced the birth of the universe: the equipment had picked up something important that the astronomers didn’t initially understand. They received great credit for their ‘discovery’ even though it was inadvertent, realised by chance. Taleb also references Alexander Fleming’s discovery of the antibacterial properties of penicillin as another famous example of a highly important yet inadvertent discovery. These instances show the unplanned nature of Black Swans – if people knew they would discover such things, they would plan for them and thus they would not be Black Swans.
Taleb feels it important to highlight the ‘ludic fallacy’, from the Latin ‘ludus’, meaning game. People too often view chance in terms of games, like throws of dice, but in games the odds are computable; in real life they are not. People get the wrong idea about casinos, thinking they are liable to Black Swans, when in fact casino bosses are very calculating about the risks they allow and take measures to prevent anyone from winning too much – even flying in, at the casino’s expense, ‘whales’ who may swing several million dollars in one bout (easy for a successful casino to cover), which is acceptable because it prevents big losses to lucky outsiders. Gambling is “sterilised and domesticated uncertainty”.
The difficulty of prediction in real life can be seen in the fact that sometimes people who focus less on detailed information are highly successful. Taleb uses the example of Aristotle Onassis, who managed to succeed whilst showing very little interest in the technical details of business. He got up late, socialised, pursued beautiful women and did not apply rigorous technical analysis to his business dealings; yet he was a charmer, a dealmaker, and ran a shipping empire. One could see an analogy with Donald Trump, of whom Taleb is a fan. The ability to delegate could also be added into the mix. The point is that someone doesn’t necessarily need a great deal of detailed information to succeed in what they do; indeed, too much information can be a bad thing.
When theories become stuck in people’s minds they are difficult to remove. Taleb cites a 1965 study of clinical psychologists by Stuart Oskamp which showed that giving them additional information about patients did not improve their diagnostic abilities; it only entrenched them more deeply in their existing views. The same principle applied to a study of bookmakers, whose predictions did not improve when more information about performances was made available to them. Their confidence, however, did rise. This leads us on to the problem with ‘experts’.
One of the most interesting themes of the book concerns what Taleb calls “epistemic arrogance”, something any of us can be guilty of, i.e. being overconfident about how much we think we know. But this problem is particularly pronounced amongst ‘experts’ whose views we are often expected to accept without question. This isn’t to say there aren’t people who should be regarded as competent authorities in certain professions (e.g. chess masters, physicists, test pilots), but in other areas the ‘experts’ tend to be less competent (e.g. stockbrokers, psychiatrists, intelligence analysts). As Taleb notes, it is perfectly valid to question an expert’s error rate. He references the work of psychologist Philip Tetlock, who studied various political and economic experts asked to judge the likelihood of a number of events occurring within five years – some 27,000 predictions in total. The study showed that the experts’ error rates were many times what they themselves had estimated, and that level of education made little difference, from undergraduates to PhDs, from journalists to professors. But “those who had a big reputation were worse predictors than those who had none”. This might bring a knowing smile to people who voted Leave in the United Kingdom referendum on European Union membership, given the relentlessly gloomy predictions of ‘experts’ cited by Remainers which have failed to come to pass – Leavers are still sneered at when they ignore or question such people.
The author describes himself as a ‘sceptical empiricist’ and prefers to base his views on practice and seeking what lies outside of the Platonic fold (the point at which neat, categorised, Platonic representations come into contact with reality). He does not advocate outright scepticism – this can lead to extreme withdrawal. Rather his principal aim is “not to be a sucker in things that matter”. It is not that people should dodge risks or ‘avoid crossing the street’ as some have tried to suggest is Taleb’s view – instead he simply wants to ensure that we don’t walk across the street blindfolded.
It is important that we aren’t complacent in assuming things will continue as they always have. Taleb demonstrates this with the example of a turkey, well fed and looked after for 1,000 days. Said bird could well assume this treatment will continue, but on day 1,001 a nasty surprise occurs and it becomes someone’s Thanksgiving dinner. This highlights the problem of induction, i.e. drawing general conclusions from specific instances. He notes that absence of evidence is not evidence of absence, and although people can focus on certain information and find many confirmatory instances that they think back up their assertions, this doesn’t mean there isn’t that one Black Swan lurking to shatter their thinking. We tend to look for instances that confirm our biases and versions of events.
To illustrate the folly of applying methods that assume certainty to situations they aren’t suited to, Taleb puts forward two imaginary places, Mediocristan and Extremistan. Some areas of life are not subject to Black Swans – there are limitations; these areas are non-scalable, i.e. there is a distinct range that will not change, so things are more even and one instance from the outer edge of the range will not upset the average: this is Mediocristan. In other areas, however, the difference in range can be massive; if there is an upper limit it is so high that it might as well not exist. Such areas are scalable, and the winner takes all: this is Extremistan.
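The contrast can be made concrete with a small sketch (my own illustrative numbers, not Taleb’s): add one extreme individual to a sample of a thousand and see how much the average moves for a non-scalable quantity like height versus a scalable one like wealth.

```python
# Toy contrast between Mediocristan and Extremistan
# (illustrative numbers, not from the book).
heights_cm = [170.0] * 1000      # Mediocristan: human height
wealth = [50_000.0] * 1000       # Extremistan: net worth in dollars

tallest_ever = 272.0             # roughly the tallest human ever recorded
billionaire = 50_000_000_000.0   # one mega-billionaire outlier

heights_cm.append(tallest_ever)
wealth.append(billionaire)

avg_height = sum(heights_cm) / len(heights_cm)
avg_wealth = sum(wealth) / len(wealth)

# The height average barely moves (~170.1 cm); the wealth average
# leaps to ~$50,000,000 -- a thousand times the typical person.
print(f"average height: {avg_height:.1f} cm")
print(f"average wealth: {avg_wealth:,.0f} dollars")
```

In Mediocristan no single observation can dominate the total; in Extremistan one observation can be the total, which is exactly why averages and bell curves mislead there.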
In the third part of the book Taleb voices his disdain for the use of the bell curve (or Gaussian distribution, named after Carl Friedrich Gauss, although he didn’t invent the idea). It is often used by ‘experts’ to make projections that end up staggeringly wrong, yet they never seem to learn; confronting them with the truth often produces only cognitive dissonance. Even those who take the point will say ‘it’s better than nothing’. As Taleb says, the bell curve can be useful in Mediocristan, where there is little real variation and the largest example of something won’t affect the average, so Black Swans are not going to occur. But in Extremistan, where Black Swans can and do happen, the bell curve is unsound:
“Measures of uncertainty that are based on the bell curve simply disregard the possibility, and the impact, of sharp jumps or discontinuities and are, therefore, inapplicable in Extremistan.”
Bell curves suck the randomness out of life, which plays to the natural human tendency to seek certainty. They average out information – human height, for example (no person is 100 feet tall), where no instance from either end of the scale will be far enough from the norm to upset the average. But if bell curves are applied in the same manner to, say, currency fluctuations, they are utterly useless in predicting the impact of randomness, because just one random occurrence can shake up the whole system. In spite of this, the bell curve continues to be used as a risk-management tool by regulators and central bankers.
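A rough sketch shows how dramatically the bell curve discounts large moves (my own illustration, not Taleb’s calculation; the power-law exponent of 3 is an assumption, in the range often fitted to market returns): compare the Gaussian probability of a move beyond k standard deviations with a crude fat-tailed alternative.

```python
import math

def gaussian_tail(k: float) -> float:
    """P(Z > k) for a standard normal variable."""
    return 0.5 * math.erfc(k / math.sqrt(2))

def power_law_tail(k: float) -> float:
    """Toy Pareto-style tail P(>k) = k**-3, an assumed exponent."""
    return k ** -3

# The Gaussian treats a 10-sigma day as essentially impossible
# (~1e-23), while the fat tail still gives it real odds (~1e-3):
# a discrepancy of twenty orders of magnitude.
for k in (3, 5, 10):
    g, p = gaussian_tail(k), power_law_tail(k)
    print(f"{k:>2}-sigma: Gaussian {g:.1e} vs fat tail {p:.1e} "
          f"({p / g:.1e}x more likely)")
```

This is the sense in which Gaussian risk measures “simply disregard the possibility, and the impact, of sharp jumps”: the model doesn’t merely underestimate them, it prices them out of existence.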
This isn’t to say that uncertainties can’t be tackled at all. Benoît Mandelbrot, a mentor of Taleb, produced a method using ‘fractals’, in which small parts resemble, to some degree, the whole, whether in nature or in mathematics and computer models. These can be helpful (Grey Swans, as Taleb calls them), although they don’t pretend to provide certainties – only Gaussian bell curve models claim to do that, and if wrongly applied the results can be dire.
Adolphe Quételet was the statistician who sought to use such techniques to establish a ‘standard human’ or ‘average man’. He initially restricted this to physical characteristics but then expanded into social matters. His thinking influenced Karl Marx, who cites him in Das Kapital, writing that “societal deviations in terms of the distribution of wealth, for example, must be minimised”. Divergence from the mean or median was treated as an error, or abnormal. One can draw one’s own conclusions about the impact of such an idea when disseminated in philosophies that took inspiration from Marx.
Even Nobel Prize winners like Myron Scholes and Robert C. Merton have fallen into the Gaussian trap, and have been duly scorned by Taleb, much to their ire. They helped to create the risk-analysis models used by Long-Term Capital Management (LTCM), a company staffed by people with impressive résumés and hailed far and wide for its seemingly sophisticated calculations. In the summer of 1998 a Black Swan occurred – a series of events lying outside LTCM’s models, triggered by a Russian financial crisis – and the company went bust. That should have sunk the theories responsible too, yet they continued to be taught in business schools as if faultless.
Although The Black Swan received much critical praise upon its initial release in 2007, Taleb’s message wasn’t taken seriously by many at first. The 2008 financial crisis, however, caused an upsurge in interest in his ideas and made him seem prophetic. He didn’t specifically predict a crisis at that time, but he did hint at one, noting the problems the mortgage giant Fannie Mae was having and the levels of bureaucracy and interconnection within the globalised banking system, which gave the appearance of stability but in reality meant that “when one falls, they all fall”. As he states, there might be fewer failures under such globalised systems, but when they do come, the shock waves are felt far and wide.
The book has humorous moments, with Taleb aiming sly digs at self-important academic types in particular and using colourful characters, both real and fictional, to illustrate his points. It is also quirky: he can go off on a tangent, or mention expressions and concepts before he has fully explained them, which some readers may find hard going. He uses terms (such as Gaussian and Mandelbrotian) that will be unfamiliar to readers new to the ideas he references, and he invents his own terms for certain concepts (for instance Mediocristan, Extremistan, epistemocracy, ludic fallacy). Explanations can be found in the text, but they are not always clear at first mention, so a degree of focus and patience is required before the meaning behind some of his ideas becomes apparent – the effort, though, is justified.

Taleb was accused by one reviewer of being facetious (although said reviewer was a journalist for The Guardian, so again, draw your own conclusions). He is aware of his own faults: he comes across as irreverent (which he freely admits) towards people whose views he dislikes, yet can also be fawning (again, by his own admission) about those he admires. Some might see him as arrogant or haughty in his dismissal of certain views whilst emphasising the correctness of his own, but what is clear is that he simply doesn’t suffer fools gladly. You might not agree with everything he says, but much of it is difficult to argue with, and if you give him the opportunity, he can change your perspective on life for the better.
© The Black Swan 2018