
Black swan theory

The black swan theory or theory of black swan events is a metaphor that describes an event that comes as a surprise, has a major effect, and is often inappropriately rationalized after the fact with the benefit of hindsight.

The theory was developed by Nassim Nicholas Taleb to explain:

1. The disproportionate role of high-profile, hard-to-predict, and rare events that are beyond the realm of normal expectations in history, science, finance, and technology.
2. The non-computability of the probability of consequential rare events using scientific methods (owing to the very nature of small probabilities).
3. The psychological biases that make people individually and collectively blind to uncertainty and unaware of the massive role of the rare event in historical affairs.

Unlike the earlier philosophical "black swan problem", the "black swan theory" refers only to unexpected events of large magnitude and consequence and their dominant role in history. Such events, considered extreme outliers, collectively play vastly larger roles than regular occurrences. More technically, in the scientific monograph Lectures on Probability and Risk in the Real World: Fat Tails (Volume 1), Taleb mathematically defines the black swan problem as "stemming from the use of degenerate metaprobability".[2]

Background

Black swan events were introduced by Nassim Nicholas Taleb in his 2001 book Fooled by Randomness, which concerned financial events. His 2007 book The Black Swan extended the metaphor to events outside of financial markets. Taleb regards almost all major scientific discoveries, historical events, and artistic accomplishments as "black swans"—undirected and unpredicted. He gives the rise of the Internet, the personal computer, World War I, the dissolution of the Soviet Union, and the September 2001 attacks as examples of black swan events.

The phrase "black swan" derives from a Latin expression; its oldest known occurrence is the poet Juvenal's characterization of something being "rara avis in terris nigroque simillima cygno" ("a rare bird in the lands and very much like a black swan"; 6.165).[4] In English, when the phrase was coined, the black swan was presumed not to exist. The importance of the metaphor lies in its analogy to the fragility of any system of thought. A set of conclusions is potentially undone once any of its fundamental postulates is disproved. In this case, the observation of a single black swan would be the undoing of the logic of any system of thought, as well as any reasoning that followed from that underlying logic.

Juvenal's phrase was a common expression in 16th century London as a statement of impossibility. The London expression derives from the Old World presumption that all swans must be white because all historical records of swans reported that they had white feathers.[5] In that context, a black swan was impossible or at least nonexistent. After Dutch explorer Willem de Vlamingh discovered black swans in Western Australia in 1697,[6] the term metamorphosed to connote that a perceived impossibility might later be disproven. Taleb notes that in the 19th century John Stuart Mill used the black swan logical fallacy as a new term to identify falsification.[7]

Taleb asserts:[8]

What we call here a Black Swan (and capitalize it) is an event with the following three attributes.

First, it is an outlier, as it lies outside the realm of regular expectations, because nothing in the past can convincingly point to its possibility. Second, it carries an extreme 'impact'. Third, in spite of its outlier status, human nature makes us concoct explanations for its occurrence after the fact, making it explainable and predictable.

I stop and summarize the triplet: rarity, extreme 'impact', and retrospective (though not prospective) predictability. A small number of Black Swans explains almost everything in our world, from the success of ideas and religions, to the dynamics of historical events, to elements of our own personal lives.

Identifying a black swan event

Based on the author's criteria:

1. The event is a surprise (to the observer).
2. The event has a major effect.
3. After the first recorded instance of the event, it is rationalized by hindsight, as if it could have been expected; that is, the relevant data were available but unaccounted for in risk mitigation programs. The same is true for the personal perception by individuals.

Coping with black swan events

The main idea in Taleb's book is not to attempt to predict black swan events, but to build robustness against negative ones that occur and be able to exploit positive ones. Taleb contends that banks and trading firms are very vulnerable to hazardous black swan events and are exposed to unpredictable losses. On the subject of business in particular, Taleb is highly critical of the widespread use of the normal distribution model as the basis for calculating risk. For example, a paper produced by academics from Oxford University and based on data from 1,471 IT projects showed that although the average cost overrun was only 27%, one in six of the projects had a cost overrun of 200% and a schedule overrun of almost 70%.[9]
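To make the shape of that finding concrete, here is a minimal simulation sketch; the mixture distribution and its parameters are assumptions chosen for illustration, not the Oxford study's data or method. It shows how a heavy tail lets a small minority of runaway projects carry most of the aggregate overrun, so the average alone understates the risk.

    import random

    random.seed(0)

    def simulated_overrun() -> float:
        """Hypothetical mixture: five projects in six overrun modestly, one in six blows up."""
        if random.random() < 1 / 6:
            return random.uniform(2.0, 4.0)   # runaway project: 200-400% over budget
        return random.uniform(-0.1, 0.3)      # typical project: -10% to +30%

    overruns = sorted(simulated_overrun() for _ in range(100_000))
    total_overrun = sum(o for o in overruns if o > 0)
    worst_sixth = overruns[-len(overruns) // 6:]   # the largest sixth of overruns
    tail_share = sum(worst_sixth) / total_overrun

    print(f"mean overrun: {sum(overruns) / len(overruns):.0%}")
    print(f"share of total overrun carried by the worst sixth of projects: {tail_share:.0%}")

Under these assumed parameters the worst sixth of projects accounts for the bulk of all overrun cost, which is the pattern a normal-distribution risk model tends to miss.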

In the second edition of The Black Swan, Taleb provides "Ten Principles for a Black-Swan-Robust Society".

Taleb states that a black swan event depends on the observer. For example, what may be a black swan surprise for a turkey is not a black swan surprise to its butcher; hence the objective should be to "avoid being the turkey" by identifying areas of vulnerability in order to "turn the Black Swans white".[11]

Epistemological approach

Taleb's black swan is different from the earlier philosophical versions of the problem, specifically in epistemology, as it concerns a phenomenon with specific empirical and statistical properties which he calls "the fourth quadrant".

Taleb's problem is about epistemic limitations in some parts of the areas covered by decision making. These limitations are twofold: philosophical (mathematical) and empirical (known human epistemic biases). The philosophical problem concerns the decrease in knowledge when it comes to rare events, as these are not visible in past samples and therefore require a strong a priori assumption, or an extrapolating theory; accordingly, predictions of events depend more and more on theories when their probability is small. In the fourth quadrant, knowledge is uncertain and consequences are large, requiring more robustness.[citation needed]
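A small simulation can make this epistemic point concrete; the probability and sample sizes below are assumptions for illustration only. An event with true probability 1 in 10,000 is almost never present in a history of 1,000 past observations, so the empirical estimate of its frequency is usually exactly zero and any non-zero assessment has to come from a prior theory rather than from the data.

    import random

    random.seed(42)
    true_p, sample_size, num_histories = 1e-4, 1_000, 10_000

    histories_with_no_event = sum(
        all(random.random() >= true_p for _ in range(sample_size))
        for _ in range(num_histories)
    )
    print(f"true event probability:      {true_p}")
    print(f"histories showing no event:  {histories_with_no_event / num_histories:.1%}")
    # Roughly (1 - 1e-4) ** 1000, i.e. about 90% of simulated histories contain no
    # instance of the event, so the empirical frequency estimate there is zero.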

According to Taleb,[13] thinkers who came before him and who dealt with the notion of the improbable, such as Hume, Mill, and Popper, focused on the problem of induction in logic, specifically that of drawing general conclusions from specific observations. The central and unique attribute of Taleb's black swan event is that it is high-profile. His claim is that almost all consequential events in history come from the unexpected, yet humans later convince themselves that these events are explainable in hindsight.

One problem, labeled the ludic fallacy by Taleb, is the belief that the unstructured randomness found in life resembles the structured randomness found in games. This stems from the assumption that the unexpected may be predicted by extrapolating from variations in statistics based on past observations, especially when these statistics are presumed to represent samples from a normal distribution. These concerns often are highly relevant in financial markets, where major players sometimes assume normal distributions when using value at risk models, although market returns typically have fat tail distributions.[citation needed]
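As a rough illustration of why the normality assumption matters, the sketch below compares two-sided tail probabilities of large moves under a bell curve and under a fat-tailed Student-t distribution. The choice of 3 degrees of freedom is an arbitrary assumption rather than a market estimate, and the snippet assumes SciPy is available.

    from scipy.stats import norm, t

    df = 3  # assumed degrees of freedom: lower df means fatter tails

    for k in (3, 5, 10):  # size of the daily move, in standard units
        p_normal = 2 * norm.sf(k)   # two-sided tail probability under the bell curve
        p_fat = 2 * t.sf(k, df)     # two-sided tail probability under fat tails
        print(f"{k:>2}-unit move:  normal {p_normal:.1e}   Student-t(df={df}) {p_fat:.1e}   "
              f"(~{p_fat / p_normal:.0e} times more likely)")

The bell curve makes a 10-unit move look essentially impossible, while the fat-tailed model still assigns it non-negligible probability; a value-at-risk number computed under the first assumption therefore says little about exposure to the second.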

Taleb said "I don't particularly care about the usual. If you want to get an idea of a friend's temperament, ethics, and personal elegance, you need to look at him under the tests of severe circumstances, not under the regular rosy glow of daily life. Can you assess the danger a criminal poses by examining only what he does on an ordinary day? Can we understand health without considering wild diseases and epidemics? Indeed the normal is often irrelevant. Almost everything in social life is produced by rare but consequential shocks and jumps; all the while almost everything studied about social life focuses on the "normal," particularly with "bell curve" methods of inference that tell you close to nothing. Why? Because the bell curve ignores large deviations, cannot handle them, yet makes us confident that we have tamed uncertainty. Its nickname in this book is GIF, Great Intellectual Fraud."

More generally, decision theory based on a fixed universe or a model of possible outcomes ignores and minimizes the effect of events that are "outside the model". For instance, a simple model of daily stock market returns may include extreme moves such as Black Monday (1987), but might not model the breakdown of markets following the 9/11 attacks. A fixed model considers the "known unknowns" but ignores the "unknown unknowns". Donald Rumsfeld's famous statement[14] in a 2002 DoD press briefing is said to have been inspired by a presentation Taleb gave at the DoD shortly before.[citation needed] Taleb's 2001 book Fooled by Randomness was about financial events, but had already introduced the Black Swan concept.[15][16][17]

Taleb notes that other distributions, such as fractal, power-law, or scalable distributions, are not usable with precision but are often more descriptive, and that awareness of these might help to temper expectations.[18]
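One way to see what "scalable" means here is to compare survival functions; the Pareto tail exponent below is an illustrative assumption, not a calibration taken from Taleb. For a power law, the ratio P(X > 2x) / P(X > x) is the same at every scale, while for a bell curve it collapses toward zero as x grows.

    from math import erfc, sqrt

    ALPHA = 1.5  # assumed Pareto tail exponent (illustrative)

    def pareto_sf(x: float) -> float:
        """P(X > x) for a Pareto distribution with minimum 1 and tail exponent ALPHA."""
        return x ** -ALPHA

    def normal_sf(x: float) -> float:
        """P(Z > x) for a standard normal, via the complementary error function."""
        return erfc(x / sqrt(2)) / 2

    for x in (2, 4, 8, 16):
        print(f"x={x:>2}:  Pareto P(>2x)/P(>x) = {pareto_sf(2 * x) / pareto_sf(x):.3f}   "
              f"normal P(>2x)/P(>x) = {normal_sf(2 * x) / normal_sf(x):.1e}")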

Beyond this, he emphasizes that many events simply are without precedent, undercutting the basis of this type of reasoning altogether.

Taleb also argues for the use of counterfactual reasoning when considering risk.[19][page needed][20]

References
http://www.fooledbyrandomness.com/FatTails.html
JSTOR 294875
"Opacity". Fooled by Randomness. Retrieved 2011-10-17.
"Black Swan Unique to Western Australia", Parliament, AU: Curriculum, archived from the original on 2011-03-01.
Hammond, Peter (October 2009), WERI Bulletin (1), UK: Warwick.
"The Black Swan: The Impact of the Highly Improbable". The New York Times. 22 April 2007.
Flyvbjerg, Bent; Budzier, Alexander (September 2011), "Why Your IT Project May Be Riskier Than You Think", Harvard Business Review, vol. 89, no. 9, pp. 601-603.
Webb, Allen (December 2008). "Taking improbable events seriously: An interview with the author of The Black Swan (Corporate Finance)" (Interview). McKinsey Quarterly. McKinsey. p. 3. Retrieved 23 May 2012. "Taleb: In fact, I tried in The Black Swan to turn a lot of black swans white! That's why I kept going on and on against financial theories, financial-risk managers, and people who do quantitative finance."
Taleb, Nassim Nicholas (April 2007). The Black Swan: The Impact of the Highly Improbable (1st ed.). London: Penguin. p. 400. ISBN 1-84614045-5. Retrieved 23 May 2012.
DoD News Briefing - Secretary Rumsfeld and Gen. Myers, February 12, 2002, 11:30 AM EDT.
"Days that shook the world", Oliver Burkeman, book review in The Guardian, 2007.
"A Point of View: See no evil", 10 January 2014.
Kursbuch 180: Nicht wissen (Not Knowing), Armin Nassehi and Peter Felixberger, Murmann Verlag DE, 2 December 2014.
Gelman, Andrew (April 2007). "Nassim Taleb's 'The Black Swan'". Statistical Modeling, Causal Inference, and Social Science. Columbia University. Retrieved 23 May 2012.
Taleb, Nassim Nicholas (22 April 2007), "The Black Swan", The New York Times.
Gangahar, Anuj (16 April 2008). "Market Risk: Mispriced risk tests market faith in a prized formula". The Financial Times. New York. Archived from the original on 20 April 2008. Retrieved 23 May 2012.
Bibliography

Taleb, Nassim Nicholas (2010) [2007], The Black Swan: The Impact of the Highly Improbable (2nd ed.), London: Penguin, ISBN 978-0-14103459-1. Retrieved 23 May 2012.
Taleb, Nassim Nicholas (September 2008), "The Fourth Quadrant: A Map of the Limits of Statistics", Third Culture, The Edge Foundation. Retrieved 23 May 2012.
Taleb, Nassim Nicholas; Denev, Alexander (January 2014), Portfolio Management under Stress, Cambridge University Press. Retrieved 1 January 2014.
