The Uncertainty of Human Knowing
We can never know all there is to know about anything — this is a fundamental ‘law’ of Nature. It is, in fact, the only cause of mistakes. Ignorance is the word that best describes the human condition. Alfred Korzybski explained this condition scientifically as the Principle of Non-Allness: we humans make all of our decisions with incomplete and imperfect knowing. We make every choice without all the information. All humans live and act in a state of ignorance. Korzybski felt that developing an awareness of this ‘law’ of Nature was so fundamentally important that he developed a lesson especially for children. Korzybski would explain:
“Children, today we want to learn all about the apple.”
He would place an apple in view of the children, “Do you children know about the apple?”
“I do!”, “I do!”, “Yes, I know about apples!”
“Good,” Korzybski said, moving to the blackboard. “Come, tell me about the apple.”
“The Apple is a fruit.”, “The apple is red.”, “The apple grows on a tree.”
Korzybski would begin to list the characteristics described by the children on the blackboard.
The children continued, “An apple a day keeps the Doctor away.”
Korzybski continued listing the children’s answers until they ran out of ideas; then he would ask, “Is that all we can say about the apple?”
When the children answered in the affirmative, Korzybski would remove his pocket-knife and cut the apple in half, passing the parts among the children.
“Now, children, can we say more about the apple?”
“The apple smells good.” “The juices are sweet.” “The apple has seeds.” “Its pulp is white.” “Mother makes apple pie.”
Finally, when the children had again run out of answers, Korzybski would ask, “Now, is that all we can say about the apple?” When the children agreed that it was all that could be said, he would again reach into his pocket, only this time he removed a ten-power magnifying lens and passed it to the children. The children would examine the apple and again respond:
“The apple pulp has a pattern and a structure.” “The skin of the apple has pores.” “The leaves have fuzz on them.” “The seeds have coats.”
Thus Korzybski would teach the children the lesson of Non-ALLness. Now we could continue to examine the apple — with a light microscope, x-ray crystallography, and eventually the electron microscope. We would continue to discover more to say about the apple. However, we can never know ALL there is to know about anything in Nature. We humans have the power to know about Nature, but not to know ALL. Knowing is without limit, but knowing is not total. Universe is our human model of Nature. Our ‘knowing’ can grow ever more complete. It can grow closer and closer to the ‘Truth’, but it cannot equal the ‘Truth’. It must always be incomplete.
The following essay is reposted from EDGE.
Learning to Expect the Unexpected
The 9/11 commission has drawn more attention for the testimony it has gathered than for the purpose it has set for itself. Today the commission will hear from Condoleezza Rice, national security adviser to President Bush, and her account of the administration’s policies before Sept. 11 is likely to differ from that of Richard Clarke, the president’s former counterterrorism chief, in most particulars except one: it will be disputed.
There is more than politics at work here, although politics explains a lot. The commission itself, with its mandate, may have compromised its report before it is even delivered. That mandate is “to provide a ‘full and complete accounting’ of the attacks of Sept. 11, 2001 and recommendations as to how to prevent such attacks in the future.”
It sounds uncontroversial, reasonable, even admirable, yet it contains at least three flaws that are common to most such inquiries into past events. To recognize those flaws, it is necessary to understand the concept of the “black swan.”
A black swan is an outlier, an event that lies beyond the realm of normal expectations. Most people expect all swans to be white because that’s what their experience tells them; a black swan is by definition a surprise. Nevertheless, people tend to concoct explanations for them after the fact, which makes them appear more predictable, and less random, than they are. Our minds are designed to retain, for efficient storage, past information that fits into a compressed narrative. This distortion, called the hindsight bias, prevents us from adequately learning from the past.
Black swans can have extreme effects: just a few explain almost everything, from the success of some ideas and religions to events in our personal lives. Moreover, their influence seems to have grown in the 20th century, while ordinary events — the ones we study and discuss and learn about in history or from the news — are becoming increasingly inconsequential.
Consider: How would an understanding of the world on June 27, 1914, have helped anyone guess what was to happen next? The rise of Hitler, the demise of the Soviet bloc, the spread of Islamic fundamentalism, the Internet bubble: not only were these events unpredictable, but anyone who correctly forecast any of them would have been deemed a lunatic (indeed, some were). This accusation of lunacy would have also applied to a correct prediction of the events of 9/11 — a black swan of the vicious variety.
A vicious black swan has an additional elusive property: its very unexpectedness helps create the conditions for it to occur. Had a terrorist attack been a conceivable risk on Sept. 10, 2001, it would likely not have happened. Jet fighters would have been on alert to intercept hijacked planes, airplanes would have had locks on their cockpit doors, airports would have carefully checked all passenger luggage. None of that happened, of course, until after 9/11.
Much of the research into humans’ risk-avoidance machinery shows that it is antiquated and unfit for the modern world; it is made to counter repeatable attacks and learn from specifics. If someone narrowly escapes being eaten by a tiger in a certain cave, then he learns to avoid that cave. Yet vicious black swans by definition do not repeat themselves. We cannot learn from them easily.
All of which brings us to the 9/11 commission. America will not have another chance to hold a first inquiry into 9/11. With its flawed mandate, however, the commission is in jeopardy of squandering this opportunity.
The first flaw is the error of excessive and naïve specificity. By focusing on the details of the past event, we may be diverting attention from the question of how to prevent future tragedies, which are still abstract in our mind. To defend ourselves against black swans, general knowledge is a crucial first step.
The mandate is also a prime example of the phenomenon known as hindsight distortion. To paraphrase Kierkegaard, history runs forward but is seen backward. An investigation should avoid the mistake of overestimating cases of possible negligence, a chronic flaw of hindsight analyses. Unfortunately, the hearings show that the commission appears to be looking for precise and narrowly defined accountability.
Yet infinite vigilance is not possible. Negligence in any specific case needs to be compared with the normal rate of negligence for all possible events at the time of the tragedy — including those events that did not take place but could have. Before 9/11, the risk of terrorism was not as obvious as it seems today to a reasonable person in government (which is part of the reason 9/11 occurred). Therefore the government might have used its resources to protect against other risks — with invisible but perhaps effective results.
The third flaw is related. Our system of rewards is not adapted to black swans. We can set up rewards for activity that reduces the risk of certain measurable events, like cancer rates. But it is more difficult to reward the prevention (or even reduction) of a chain of bad events (war, for instance). Job-performance assessments in these matters are not just tricky, they may be biased in favor of measurable events. Sometimes, as any good manager knows, avoiding a certain outcome is an achievement.
The greatest flaw in the commission’s mandate, regrettably, mirrors one of the greatest flaws in modern society: it does not understand risk. The focus of the investigation should not be on how to avoid any specific black swan, for we don’t know where the next one is coming from. The focus should be on what general lessons can be learned from them. And the most important lesson may be that we should reward people, not ridicule them, for thinking the impossible. After a black swan like 9/11, we must look ahead, not in the rear-view mirror.
[Editor’s Note: First published as an Op-Ed Page article in The New York Times on April 8th.]
NASSIM NICHOLAS TALEB is an essayist and mathematical trader and the author of Dynamic Hedging and Fooled by Randomness (2nd Edition, published April 9th).