Kathryn Schulz
Anton’s Syndrome is a disorder that belongs to a group of neurological problems known as anosognosia—the denial of disease. Anton’s Syndrome specifically refers to a condition in which blind people do not know they are blind. The author uses this condition to demonstrate that error knows no limits and that any form of knowledge, however central or seemingly unassailable, can fail us.
The “bias blind spot” is the psychological term for a specific asymmetry in human reasoning: If we want to discredit a belief, we will argue that others hold that belief only because it benefits them, whereas if we wish to champion a belief, we will argue for its veracity. This asymmetry helps explain why we dismiss the self-serving facets of our own beliefs while being quick to detect them in others’.
Coherencing is the process by which the brain automatically corrects our vision despite our blind spot—the part of the eye where the optic nerve goes through the retina. Coherencing thus creates a kind of visual illusion. The author uses coherencing as an example of the unconscious perceptual processes that mask the fallibility of our perception.
Confabulation occurs when the brain essentially makes something up without knowing it is doing so. The author uses split-brain patients and Anton’s Syndrome to illustrate confabulation. These patients are not dishonest, nor are they experiencing delusions; they are simply unaware that what they are saying is untrue, demonstrating the human mind’s tendency to fill in gaps where we lack knowledge.
The confirmation bias is the tendency to seek out and give more weight to evidence that confirms our existing beliefs. We thus do not assess evidence neutrally but rather in light of theories we have already formed based on earlier evidence.
This is the author’s term for a person’s limited options to explain why they believe the things they do. This constraint’s stipulations include that it only applies to the beliefs we currently hold, rather than our prior beliefs; that it only applies to specific beliefs, not to our general beliefs; and that it applies only to our own beliefs, not the beliefs of others. The ’Cuz It’s True Constraint helps explain why we are so convinced of the veracity of our own beliefs but do not feel the need to extend this assumption to others.
The disagreement deficit is a problem the author says plagues all of us as members of communities, and it has four parts: Our communities disproportionately expose us to information that supports our own ideas; these communities shield us from outsiders’ disagreement; they cause us to ignore the contrary information we do encounter from outside the group; and they squelch disagreement from within the communities.
The Ebbinghaus Curve of Forgetting is a graph that plots the predictable erosion of our memories’ accuracy over time. It undermines the flashbulb memory theory by demonstrating that we forget surprising and traumatic events with as much regularity as mundane ones. The author uses the curve to illustrate the fallibility of human memory.
Error blindness is the phenomenon whereby we do not realize our mistakes because our errors are necessarily beyond our perception. That is, the reason error is possible is that while it is occurring, we are oblivious to it. The author states that this phenomenon is why we are constantly surprised by our own mistakes.
Error studies is an applied field whose practitioners range from psychologists to economists to engineers. Their goal is to divide and classify error in order to reduce the likelihood and potential impact of future mistakes.
The Evil Assumption is the phenomenon wherein we assume that those who disagree with us are neither ignorant of the evidence nor unable to comprehend it but that they willfully choose to deny it. This is an example of how we make hasty and potentially unfair judgments about others’ belief systems.
A flashbulb memory is an exceptionally vivid and detailed autobiographical memory of a surprising (or traumatizing) event. The scientists who originally coined the term hypothesized that such powerful memories stem from unique evolutionary imperatives and therefore form through neurological processes different from those handling everyday events. Various experiments, along with the Ebbinghaus Curve of Forgetting, have since challenged these models of flashbulb memory. Believing the concept to be tenuous, the author uses it to suggest the fallibility of the human mind.
This is a term used by historians to refer to prophesied events, such as the apocalypse, that do not come to pass. The book cites one such instance: William Miller, a Christian preacher, falsely prophesied the Second Coming of Christ. Miller’s followers, greatly disappointed, scrambled to find explanations for what went wrong; the author uses their reaction to illustrate what she terms the “how wrong?” phenomenon, wherein a person, upon realizing their mistaken belief, tries to discern how wrong they were and where exactly they erred.
Groupthink is the phenomenon whereby underexposure to challenging information can further solidify our beliefs. Groupthink tends to affect homogenous groups that are insulated from criticism, both external and internal. Symptoms of groupthink include the censorship of dissent and rejection of criticism, a belief in moral superiority, and the demonization of those with opposing beliefs.
Homophily is the sociological term for the phenomenon whereby we favor those similar to us. The author sees this phenomenon in how we often form our beliefs based on our communities but also form our communities based on our beliefs. Thus, when we hold a belief, we also hold membership in a community of like-minded believers.
Under the Idiocy Assumption, we assume that those who oppose our beliefs have all the necessary information but lack the comprehension—hence their disagreement with us. This is an example of how we sometimes unfairly judge the beliefs of others who disagree with us.
Under the Ignorance Assumption, we believe our own convictions to be based on facts and thus conclude that dissenters simply lack the appropriate information. We feel that education would inevitably cause them to see things as we do. Like the Evil Assumption and the Idiocy Assumption, the Ignorance Assumption exemplifies the human propensity for hasty and unfair judgment of others’ disagreement with us.
Inattentional blindness is a phenomenon whereby, when we are asked to look for something specific, we become unable to see things in general. The author uses this phenomenon to illustrate the quirks of perceptual processing that leave us vulnerable to error.
The incongruity theory of humor holds that comedy emerges from a mismatch between expectation and actuality. We hold a belief that is violated, producing surprise, delight, or confusion, and a new belief forms in its place. This model of comedy strongly resembles the structure of error, in which the recognition of a misconception generates new beliefs.
Inductive reasoning draws on our limited experiences to make broad guesses about the world, to divide it into categories, and to understand causal relationships. Many psychologists and neuroscientists believe inductive reasoning is the basis of all human cognition. However, the author shows that while this system of reasoning is largely successful, our inductively reasoned conjectures are only probabilistically true, meaning they are fallible.
Naïve realism is the (naïve) attitude that beliefs reflect reality rather than being constructed, as if our minds are mirrors of the truth. Naïve realists believe the world is precisely as they experience it. The author refutes this idea and argues that there are things beyond perception, such as the infrared spectrum and molecular structures.
The No True Scotsman fallacy is the phenomenon whereby we encounter evidence that contradicts our beliefs but deem this counterevidence inconsequential to our beliefs’ validity. This logical fallacy is a result of the confirmation bias. The author cites California State University philosophy professor Bradley Dowden, who illustrates the fallacy with Person A’s ad hoc reasoning in this exchange:
Person A: ‘No Scotsman puts sugar on his porridge.’
Person B: ‘But my uncle Angus is a Scotsman, and he puts sugar on his porridge.’
Person A: ‘But no true Scotsman puts sugar on his porridge’ (Goldman, David P. “No true Scotsman starts a war.” 2006. Asia Times).
Under the optimistic model of error, the experience of being wrong does not warrant the negative feelings of humiliation and defeat. Instead, this model allows for such experiences as surprise, fascination, hilarity, and delight. Schulz contrasts this model with the pessimistic model of error, in which mistakes are condemned and merit negative emotional responses.
Philosophers of science developed this argument about the nature of scientific theories. The Pessimistic Meta-Induction holds that because past scientific theories were eventually invalidated and replaced, we ought to assume that current theories will also someday be invalidated and replaced. The author extends this argument to every domain of life, including politics, technology, economics, and law.
Under the pessimistic model of error, mistakes are negative and merit such feelings as humiliation and irritation. However, Schulz states that this model is incomplete because, while it tells us that error is contemptible, it offers no justification for this perspective. The author contrasts this model with the optimistic model of error, in which mistakes are neither abhorrent nor deviant and can provoke a range of emotions, such as delight and surprise, and lead to personal transformation and enlightenment.
The term “platonic love” is named after the ancient Greek philosopher Plato. In Plato’s philosophical system, the highest form of love was intellectual—the love of one mind for another (in contrast to purely physical desire, for example). He claimed this love returns us to cosmic truths and restores us to wholeness. The author states that the Western understanding of love has changed little since Plato’s time and that we view love as the union of consciousnesses.
This belief system was held by the fifth-century BCE philosopher Protagoras. It holds that there is no external reality for our senses to perceive; instead, our sense perception is reality. Thus, if two people’s perceptions contradict each other (for example, one finds a breeze chilly while the other finds it warm), the two inhabit differing realities.
Under the recording-technology model of memory, our memory is much like a computer’s memory; we do not question the integrity of our stored memories unless we experience a problem. Thus, our memories’ vividness evinces their accuracy. Schulz disagrees, however, pointing out that contemporary neuroscientists generally agree that memory consists of multiple distinct neural processes spread across multiple neuroanatomical structures.
The representational theory of the mind is a developmental psychology concept for the stage at which children realize, unconsciously, that their mind is not a direct vessel of reality but rather a private mode of understanding the world. This cognitive development produces the insight that one’s beliefs about the world can differ from the world itself and that one’s beliefs and knowledge can differ from others’. The author uses this theory to demonstrate the divide between perception and reality.
The self-improvement theory of humor holds that we laugh at the errors of others out of self-recognition: comedy imitates our errors and thereby shows us the error of our own ways.
The scientific method emerged during the Scientific Revolution of the 16th and 17th centuries and holds that observations lead to hypotheses that are then subjected to experimental testing. The scientific method seeks falsification; a hypothesis is defined by its potential to be proven wrong, and a theory by the fact that it has not yet been proven wrong. Schulz uses this model of progress to demonstrate how errors illuminate the truth rather than obscure it.
The Scientific Revolution of the 16th and 17th centuries represented a dramatic change in scientific thought. The movement’s crowning achievement was the development of the scientific method, in which observations lead to hypotheses that are then subjected to experimentation.
Six Sigma is a quality-control process named after the Greek letter sigma (σ), which in statistical analysis represents the standard deviation from the mean. Companies that adhere to Six Sigma standards view all deviation as undesirable because it signals an error in manufacturing or design. These companies rely on three basic principles: treat error as inevitable and put contingency strategies in place; operate with openness; and rely on hard data rather than opinions or assumptions.
Splitting is a psychology term for a developmental stage in children around 12 to 18 months old. This stage is characterized by viewing the world in black-and-white, yes-and-no categories. After this stage, we develop the cognitive and linguistic capacity to acknowledge uncertainty and are thus able to accommodate the possibility of error in our lives.
According to the superiority theory of comedy, humor emerges from the fact that others look ridiculous, which makes us look better by comparison. This model of comedy, the author says, confirms one of humans’ default positions on rightness or correctness—that we possess it, while others do not.
This is the author’s term for the strategies we use to deflect personal responsibility for our errors; someone may say they were “wrong, but…,” illustrating how we append caveats to our admissions of error.