Being Wrong: Adventures in the Margin of Error (Kathryn Schulz)


My review/summary/synopsis/notes on "Being Wrong: Adventures in the Margin of Error" by Kathryn Schulz:

Part 1: The Idea of Error

Chapter 1: Wrongology

Being right is pleasurable; being wrong is unpleasant.  We have faith in our rightness.  "It is surprisingly difficult to get angry unless you are either convinced that you are correct, or humiliated and defensive about being wrong."  "The Pessimistic Meta-Induction from the History of Science": Even the most bulletproof scientific theories of times past were eventually proved wrong, so the theories of today will someday prove wrong as well.  Extrapolate into "The Pessimistic Meta-Induction from the History of Everything."

Discusses the problems of defining what it means to be wrong.  Do you believe that truth can be known?  Do you believe that there is objective reality outside of our perceptions?  Are errors with worldwide stakes the same as misplacing your car keys?

In daily life, we use "wrong" to refer both to morality and to errors.

People also feel right or wrong about matters of taste or opinion.

Possible definitions of being wrong: "Deviation from external reality" or "rejecting as false something we thought was true" -- both are bad because they don't match the varied situations where people use the word "wrong"

"Error-blindness": error does not exist in the present tense: "I am wrong" is a logical impossibility -- as soon as we recognize we are wrong, we stop believing the wrong fact, so we never experience being currently wrong.

People tend to forget the specifics of times they were wrong.  Describes Freud forgetting about a patient that he diagnosed with hysteria who died of abdominal cancer.

Most of us don't have a mental category called "Mistakes I Have Made"

Chapter 2: Two models of wrongness

Story about Ross Gelbspan writing a front-page story in the Village Voice about Donna Meadows that focused on her being pregnant; it turned out that she was not actually pregnant.

Two models of wrongness: The pessimistic model says that errors are dangerous and humiliating and distasteful and un-fun.  The optimistic model of error includes surprise, bafflement, fascination, excitement, hilarity, and delight.

Historical debate about whether error is normal or abnormal -- can error ever be eradicated?

The bell curve is an example of quantifying error -- Laplace used the bell curve to determine the precise orbits of the planets from many imprecise observations.  It is an example of embracing error.
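Laplace's trick -- many imprecise observations combining into one precise estimate -- can be sketched with a tiny simulation.  The true value and noise level below are made-up illustrative numbers, not figures from the book or from Laplace:

```python
import random
import statistics

random.seed(42)  # fixed seed so the simulation is reproducible

TRUE_VALUE = 100.0  # hypothetical "true" quantity (a stand-in for an orbital parameter)
NOISE_SD = 5.0      # typical size of each observation's error (made-up number)

def observe():
    """One imprecise observation: the truth plus bell-curve (Gaussian) noise."""
    return random.gauss(TRUE_VALUE, NOISE_SD)

# Compare the error of single observations with the error of averages of 100.
single_errors = [abs(observe() - TRUE_VALUE) for _ in range(1000)]
averaged_errors = [
    abs(statistics.mean(observe() for _ in range(100)) - TRUE_VALUE)
    for _ in range(1000)
]

print(f"typical error of one observation: {statistics.mean(single_errors):.2f}")
print(f"typical error of a 100-observation average: {statistics.mean(averaged_errors):.2f}")
```

Because independent errors pile up in a bell curve centered on the truth, averaging n observations shrinks the typical error by roughly a factor of the square root of n -- individual wrongness averaged into collective rightness.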

Freudian slips -- the idea that error exposes the truth

Analogies between errors and altered states ("are you on crack?" "what are you smoking?")  People seek out altered states and dreams and hallucinations

Insanity is also a form of wrongness.  Three factors distinguish madness from wrongness: purity, consistency, and subject matter.  If madness is radical wrongness, then wrongness is minor madness.

There is a cultural tradition of madmen who speak truth to power (children/fools/madmen in literature, like King Lear after he goes mad, and other characters in that play)

The "pessimistic" model is personified by the "juif errant" (the Wandering Jew)

The "optimistic" model is personified by the "knight errant" (Lancelot, Galahad, Gawain, Don Quixote)

Part 2: the origins of error

Chapter 3: Our Senses

John Ross mistaking a superior mirage for the "Croker Mountains" blocking Lancaster Sound (1818)

Robert Bartlett's superior-mirage sighting between Greenland and Iceland, July 17, 1939

The experience of the sky moving around you while you stand at the midpoint of the world is different from the experience of standing on a globe that orbits the sun.

Protagoras believed there is no external truth -- all truth is contained in each person's senses.

One way to understand perception is to think of it in terms of step 1: sensation; step 2: interpretation.  Interpretation can introduce error, like with the "sky turning around the earth" feeling, but is also good (eg it fills in our blind spot)

Optical illusion from Edward Adelson.  Knowing how it works doesn't prevent it from working

Being wrong is often a side effect of a system that's working correctly.  Size constancy is handy heuristic 99.99% of the time.  Superior mirage is the 0.01%

Illusions are the misleading outcomes of normal and beneficial perceptual processes.

inattentional blindness:

  • gorilla costume video http://www.simonslab.com/videos.html
  • 1972 eastern airlines flight 401 -- focusing on landing gear light blinded crew to plane descending on autopilot
  • drivers not noticing cyclists
  • pickpockets working in groups to make distractions

governments and religions have historically exploited perceptual shortcomings.  1833 Letters on Natural Magic by David Brewster describes how to use concave mirrors to project human images onto smoke.  Napoleon III sent Jean Eugène Robert-Houdin to Algeria to do magic tricks to convince the people there that the French were more powerful than the region's Islamic holy men.

Illusions teach us how to think about error -- "even the most convincing perception can diverge from reality . . . cognitive processes that we can't detect -- and that typically serve us quite well -- leave us vulnerable to mistakes."

People like optical illusions (contrast with disliking being wrong in other situations) -- it's not only because they're trivial because people "throw tantrums over trivial mistakes" -- what makes illusions different from errors is that we enter into them by consent.

In the rest of life, we never consent in advance -- all we know for sure is that we are going to make errors in the future.

Chapter 4: Our Minds, Part One: Knowing, Not Knowing, and Making It Up

Discusses examples of anosognosia: patients with a real deficit, such as paralysis, who sincerely deny having it.

Anosognosia shows that, no matter what, there is always a gap between our perception of the world and the actual world.

Suggestion: Abandon the category of "knowledge" in favor of the category of "belief."  Knowledge is a belief plus a bunch of credentials.  It's hard to determine what, if anything, you can rightly claim to know.

The feeling of knowing something is convincing, but it is not a good way to gauge the accuracy of our knowledge.

Ulric Neisser remembered a radio announcer interrupting a baseball game with a bulletin about the bombing of Pearl Harbor on December 7, 1941.  But 40 years later, it dawned on him that professional baseball isn't played in December.  These vivid memories of surprising and traumatic events are called flashbulb memories.  Neisser surveyed students about the Challenger disaster the day after it happened, and then again three years later.

Less than 7 percent of the second reports matched the initial ones, 50 percent were wrong in two-thirds of their assertions, and 25 percent were wrong in every major detail.  Subsequent work by other researchers only confirmed the conclusion.  Our flashbulb memories might remain stunningly vivid, but research suggests that their accuracy erodes over time at the same rate as our everyday recollections—a decline so precise and predictable that it can be plotted on a graph in what is known, evocatively, as the Ebbinghaus curve of forgetting.
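The Ebbinghaus curve is commonly formalized as exponential decay, R = exp(-t/S), where S is a "stability" constant.  A minimal sketch, with S chosen arbitrarily for illustration (the book gives no numbers):

```python
import math

def retention(t_days, stability=10.0):
    """Fraction of a memory retained after t_days, using the classic
    exponential model R = exp(-t / S).  The stability value is a
    made-up illustrative parameter, not a figure from the book."""
    return math.exp(-t_days / stability)

# Flashbulb and everyday memories alike slide down this curve;
# only our confidence in the flashbulb ones stays high.
for days in (0, 1, 10, 100):
    print(f"day {days:3d}: {retention(days):.2f} retained")
```

The point of the model is simply that forgetting is smooth and predictable: vividness and accuracy come apart because confidence doesn't decay along with the memory itself.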

"False memory studies" have been able to convince subjects that they experienced something as a child which they did not, like getting lost in a store or taking a hot air balloon ride.  One in four subjects will accept a false memory.  (30-60% among children).  Once the memory is established, it can be difficult to convince them that the event never happened.

We take our own certainty as an indicator of accuracy -- there is some correlation between the two, but there are some exceptions too.

Discusses Plato's strawman model of memory as a wax tablet -- modern analogies like books, gramophones, movies, and computers.  The model of memory as a storage device is wrong.  Memory is not a single function, but many different processes -- a memory is reassembled by the brain each time we call it to mind.  Vividness might come from how often we call memories to mind and how easy it is to do so.  William Hirst's theory is that some memories are psychologically or culturally unacceptable to forget (like 9/11).  "sometimes remembering becomes a moral imperative."

Discusses confabulation (done by people with Anton's Syndrome)

Dreams are kind of like confabulation -- inner "fact checker" is asleep and doesn't verify whether dreams make sense.  You don't feel surprised during your dream even though it makes no sense.

Confabulation in split-brain epilepsy patients -- scientists gave commands to the right brain ("laugh", "walk") -- then ask the patient to explain why they were laughing or walking.  Patients would make up a reason, like "you guys are too much" or "thirsty and going to get a drink".  Left brain needed to come up with a reason for behavior, but didn't get info from the right brain about the real reason, so left brain theorizes backwards from behavior.  Subjects had no befuddlement, no noticeable time lag, and no appearance of doubt or intent to deceive.

Healthy people also confabulate: Richard Nisbett and Timothy Wilson (1977) asked shoppers to compare 4 varieties of pantyhose.  Actually all 4 were identical.  Shoppers came up with a preference, and also explanations for that preference.  When confronted with the info that all four were identical, shoppers stuck with their explanations and preferences.

Alzheimer's disease and dementia are also associated with confabulation.

Hirstein (author of Brain Fiction): Admitting ignorance in response to a question isn't a low-level of function -- it's a high-level cognitive ability that confabulators have lost.

We are bad at knowing when we don't know something.

Sometimes, it's obvious: "Who is the prime minister of Kyrgyzstan?", but often there isn't necessarily a feeling of blankness to use as a guide.

Most of us are better at generating theories than registering our own ignorance.

People get into discussions about things they don't know much about.

We can come up with musings or half-baked ideas about something, and then become wedded to them by being contradicted or otherwise interacting with other people.

Recognizing the limits of our knowledge is extremely difficult.  The feeling of knowing and trusting theories that come to mind lead too easily into error.  We have no sound method for knowing what we know.

Author proposes that "belief" is broader/more complex/more interesting category than "knowledge".

Chapter 5. Our Minds, Part Two: Belief

October 23, 2008: Alan Greenspan testifies before congress.

This book is about "Greenspan moments."  When beliefs fail us.

What is a belief?  In casual conversation, it's a conscious belief, like about morality or politics or ourselves or others.

Philosophers include all unconscious beliefs too, like believing that the sky is dark outside if you're in your bedroom at night with the blinds drawn, that the sun won't rise for many more hours, that when it does, it will do so in the east, that the mattress is solid, that a flying saucer isn't about to crash through the window, etc.

Both explicit beliefs ("my father-in-law dislikes me") and implicit ones ("the mattress is solid") serve the function of helping me figure out where to sit when I enter a room.

Once an implicit assumption is violated, it becomes explicit.  If I suddenly fall through the floor, my implicit assumptions about the solidity of the floor suddenly appear in my consciousness.  The beliefs at the extreme ends of the implicit/explicit spectrum collapse most spectacularly when they are discovered incorrect.

Holding a belief can have consequences.  Belief in general relativity led to spending $300 million, plus $30 million per year, on LIGO.  If you believe your lover is nervous because he plans to propose, you will be excited; if you believe he is going to break up with you, you will be anxious.

Why do we have distal beliefs (like about South Africa's AIDS policy, or what the closest star to Earth is)?  Because we are good at theorizing for survival ("is that noise predator or prey?").  Because we need to be able to theorize about some things, we ended up able to theorize about everything.  Theorizing process is rapid and automatic and doesn't require us to deliberately engage it, so we can't stop theorizing.  We tend to mainly notice our theories when they're wrong.  Babies as young as seven months are already theorizing about gravity.  Alison Gopnik posited that theory drive exists specifically for early childhood, but operates throughout lives, just like sex drive exists specifically for fertile years, but operates both before and after.

Although we are good at making theories, we aren't good at realizing we made them.  We have a tendency toward "naïve realism" -- the assumption that our perception matches reality.  This can't be true because there are things that we can't perceive (like infrared light, or molecules).  All children under the age of four are naïve realists -- they believe that we can't believe things that are wrong (the Sally-Anne task).  Caveat: it now appears that fourteen-month-olds can pass the Sally-Anne task.  Another experiment: Show a child a box labeled "candy."  They open it and find pencils.  Then ask them what they thought was in the box 20 seconds earlier, and they say "pencils."  Children think that perception and reality are one and the same.

By age five, children acquire "representational theory of mind."

Autistic children do better on a version of the Sally-Anne task where they are asked where the candy bar will appear in a Polaroid picture, while non-autistic children find the Polaroid version harder.

Adults are convinced that our own beliefs are true (dubbed the "'Cuz It's True Constraint" by the author).  The CEO of a green tea company drinks green tea because it's good for health, but others will see that CEO as having cognitive dissonance.  If I believe P, I feel that the reason I believe P is because P is true—I don't feel that I believe P for other reasons.  There are caveats: this constraint doesn't hold for things I no longer believe, and it doesn't hold when we think about our entire set of beliefs in the abstract, and it doesn't hold when we think about other people's beliefs.  ("the bias blind spot").

If you believe that your beliefs are true, you will assume that those who disagree with you are ignorant of the facts, idiots, or evil.

Chapter 6. Our Minds, Part Three: Evidence

What counts as evidence?  There's a model of thinking where people gather evidence, assess it, and draw conclusions.  Seems reasonable to caution people against believing things without sufficient evidence.  But how do you decide how much evidence is enough?  Anyway, people are built to draw conclusions based on meager evidence, and that's an advantage.

Example: "Complete the following sentence: 'The giraffe had a very long ____'"

A. Neck / B. I'm stumped.

Human beings don't care about all possibilities, only about what's probable, and we determine what is probable based on our experience (inductive reasoning); that's what makes our brains powerful.  For everyday purposes, inductive leaps serve us better than exhaustive consideration of all the evidence.  But because inductive reasoning is quick, it also allows the possibility of error.  What makes our brains great is what makes us err.

Failures in inductive reasoning: racism, sexism.

Inductive reasoning lets us come to a conclusion based on small amounts of evidence, but does not usually allow modifying a conclusion based on further small amounts of evidence.  When it works, we call it inductive reasoning.  When it fails, we call it confirmation bias.  "No True Scotsman fallacy."  "Increased violence in Iraq is a sign that the enemy is frustrated with American success."

One form of confirmation bias: where we simply fail to look for information that might contradict our beliefs: The belief that women had one more rib than men survived until 1543 when Andreas Vesalius finally counted.  Pliny's views on menstruation in Naturalis Historia lasted for 1500 years.  There's a threshold of new evidence above which our beliefs will change, but in many cases, that threshold isn't reached.

We must remember to combat our inductive biases--attend to counterevidence as a habit.

Chapter 7. Our Society

Switzerland: Women were not allowed to vote in Switzerland until 1971.  Cantons only started granting suffrage in 1959; in Appenzell Ausserrhoden, women couldn't vote until 1989, and in Appenzell Innerrhoden, women were only granted suffrage because of a supreme court decision in 1990.  Women in Switzerland had to wait on average seven centuries longer than men for the vote (contrast with an average of 47 years in other nations).

Roger Bacon wrote "Opus Majus" in 1267 and categorized error as coming from four problems:

  1. tendency to cover up ignorance with pretense of knowledge
  2. persuasive power of authority
  3. blind adherence to custom
  4. influence of popular opinion

Francis Bacon categorized sources of error into four "idols":

  1. Tribe: human cognitive habits
  2. Cave: we distrust beliefs foreign to our own clan
  3. Marketplace: influence of public opinion
  4. Theater: false doctrine propagated by authorities

Consider the idea that if everybody is doing something, it must be a good idea.  Contrast with "don't be a lemming/think for yourself."  Glorification of independent thought can lead to conspiracy theories.  Our own direct observations aren't trustworthy.  It's not even possible to think completely for yourself--every time you read a newspaper, board an airplane, read wikipedia, vaccinate your children, or assume that your parents are your parents, you are relying on second-hand information.

Relying on secondhand information buys us a lot of time, and gives us access to much more information and knowledge than we'd get on our own.  We trust a source and therefore accept that source's information.  We are surrounded by a "network of witnesses," and thus our beliefs are influenced by our community.  We are also drawn to be around those with similar beliefs to our own (communities on the internet, landslide Democratic/Republican districts).

Asch conformity experiment

Gregory Berns replicated the Asch experiment in fMRI, and showed that subjects were not engaging decision making/conflict resolution, which suggests that the subjects weren't suppressing a correct answer to conform, but that the group actually changed their perception.

If three fake subjects can influence us to see line length differently, how much more do our communities of large numbers of people influence our real, less straightforward beliefs?

The Swiss cantons that did not grant suffrage to women did so because of tradition (Landsgemeinde).

"disagreement deficit":

  1. Communities expose us to disproportionate support for our own ideas
  2. They shield us from disagreement of outsiders ("religious fundamentalists generally don't read Darwin in their free time.")
  3. They cause us to disregard outside disagreement -- we tend to reject information from unfamiliar or disagreeable sources.
  4. They quash the development of disagreement from within

One of the quickest ways to find out if you are wrong is to state what you believe, but people don't do this.

Groupthink, and how to fight it.  The Talmud says if there is a unanimous guilty verdict in a death penalty case, the defendant must be allowed to go free (to ensure that there was a dissenting opinion).

Abdul Rahman converted to Christianity in 1990.  Wife divorced him, lost custody of kids, convicted of apostasy in 2006, imprisoned, sentenced to death, granted asylum by Italy.  What gets you into trouble with a community is abandoning a belief it cherishes.  Afghan judiciary didn't sentence born and bred Christians to death, it was rejection of Islam that was objectionable.

In Asch studies a single dissident ruined the effect, which might mean that a single dissident could destroy a community, thus communities punish those that dissent.

Changing one's belief calls the whole nature of believing into question.

We would like to believe that we'd be in the 25% of the Asch study.  We don't know which of our present beliefs will seem indefensible to us or to history in the future.  Ideally:

The people around us prevent us from believing things that are (as Penn Jillette put it) "fucking nuts," while our own inner voice keeps rising up and breaking the surface tension that could otherwise turn a community into a bubble.

Chapter 8. The Allure of Certainty

The story of the original Zealots.  Ambrose Bierce defined certainty as "being mistaken at the top of one's voice."

Certainty is the conviction that we cannot possibly be wrong.

The certainty of those with whom we disagree . . . never looks justified to us, and frequently looks odious.  As often as not, we regard it as a sign of excessive emotional attachment to an idea, or an indicator of a narrow, fearful, or stubborn frame of mind.  By contrast, we experience our own certainty as simply a side-effect of our rightness.

Certainty is lethal to imagination and empathy, and someone who is certain cannot shift perspectives.  Why do we continue to find certainty attractive?

Certainty has practical advantages.  William James gave an example of a hiker who needs to jump over a crevasse---it's better for the hiker to be certain that he can do it.  William Hirstein called confabulation "pathological certainty" and contrasted it with obsessive-compulsive disorder, which he called "pathological doubt."  Even if your partner reassures you that you locked the door, you still don't feel sure that it is locked.  "Doubt is to certainty as neurosis is to psychosis."  Certainty sometimes inspires people to change the world for the better.  Ludwig Wittgenstein argued that to get through life, we need to treat some beliefs as absolutely certain, for example, Wittgenstein's conviction that he has two hands.  If I try to verify the number of hands I have by using my eyes, am I testing the number of hands I have, or am I instead testing whether my eyes work?

Taking the time to interrogate a belief requires cognitive resources.  William Hirstein called doubt "a cognitive luxury [which] occurs only in highly developed nervous systems".  You won't find a skeptical mollusk or a skeptical one-year-old.  Wittgenstein: "The child learns by believing the adult".  Doubt is a skill, while credulity is more like an instinct.

Daniel Gilbert 1990 study to test claim by Baruch Spinoza: When we encounter a new piece of information, we accept it as true, then if we decide to reject it as false, that only happens later.  Gilbert's example: "armadillos may be lured from a thicket with soft cheese."  Other examples: swerving car to avoid a dachshund in the middle of the road before deciding whether or not it is actually there, or swerving car to avoid a unicorn in the middle of the road before considering the fact that they don't exist.  Gilbert found that people are more likely to believe false statements are true if distracted immediately after exposure to the statement, but not more likely to believe that true statements are false.

Certainty feels good emotionally.  When we feel certain, our knowledge about the world seems complete.  Our dislike of doubt is a kind of emotional agoraphobia.  Doubt is uncomfortable.  Certainty reassures us with answers; doubt confronts us with questions.  Doubt makes it obvious that we can't prevent accidents and disasters.

Discusses Hamlet: the ghost of Hamlet's father tells him to kill Claudius.  The modern reading of Hamlet is that doubt is his tragic flaw.  But consider his situation: he has been asked to commit a terrible crime by a ghost.  We don't expect normal people to be certain in such situations, but we expect it of leaders.  We conflate certainty with correctness.

The 2004 presidential election between John Kerry and George Bush was framed as a choice between a man who wavered and a man who screwed up.

John Maynard Keynes quote: "When the facts change, I change my mind.  What do you do, sir?"

But politically, staying the course is admired without regard to where the course leads.

Undecided voter: It is common in popular culture to mock and insult the undecided voter.  Perhaps it is because the electoral process and our political future are "held hostage by the tiny fraction of voters who can't make up their minds."

There's an apparent conflict between having strong convictions and being aware of being wrong.  Rollo May quote: our commitment to an idea is "healthiest when it is not without doubt, but in spite of doubt."

Leon Festinger work on certainty in the 1950s: Found that followers of Marian Keech grew more fervent after Keech's predictions did not come true.

Describes cognitive dissonance.

We are all prone to using certainty to avoid facing up to the fact that we could be wrong.

Fear of wrongness makes it harder to avoid errors, and harder to forgive ourselves and others for making them.

Part 3: The Experience of Error

Chapter 9: Being Wrong

This chapter describes what it is like to be wrong. We avoid being wrong in the present tense in two ways:

  1. By changing our beliefs slowly over many months or years.  Greg Markus asked 3,000 people in 1973 to rate their stances on a range of social issues.  A decade later, he asked them to rate their stances again and to recall how they had felt in 1973.  Their recollections more closely reflected their current beliefs than their actual 1973 ratings.  Philip Tetlock studied the accuracy of political forecasts by experts.  When contacted later, the experts misremembered their forecasts, believing them more accurate than the records showed.
  2. By changing our beliefs suddenly, jumping from "right" A to "right" B. Thomas Kuhn described a similar effect in science, where a "theory is declared invalid only if an alternate candidate is available to take its place." Similarly, we rarely evaluate our current beliefs; we hold them until something better comes along.

Occasionally we don't have an active belief.  "terrain of pure wrongness" where a belief has fallen apart, but nothing is available to replace it.  This happens with really big, important beliefs (Bernie Madoff is a financial genius, deity X exists, being wrong about someone you loved, betraying your own principles).

Story of "Anita Wilson" (name changed): Started as (her words) "crazy evangelical," was accepted into art school in New York.  Church member who was close to her killed in car accident.  She met an atheist in New York and fell in love.  While together, she adopted atheism, but when the relationship fell apart, she didn't believe either belief system anymore.

Persian philosopher al-Ghazali wrote "There can be no desire to return to servile conformism once it has been abandoned, since a prerequisite for being a servile conformist is that one does not know [oneself] to be such."  But when someone recognizes his former beliefs as false, "the glass of servile conformism is shattered—an irreparable fragmentation and a mess which cannot be mended by patching and piecing together."  Instead "it can only be melted by fire and newly reshaped."  Once you feel that you believed something for reasons other than the truth of that belief, you've destroyed your ability to keep believing it.

Anita described this experience of being "wrong" as chronic terror, being a "lost toddler".

Being wrong strips us of all theories, but makes possible real change.  Being wrong is the transition inherent in change.  It is where we destroy and rebuild ourselves.

Mere exposure to the idea that we are wrong often isn't enough to change our minds.  We receive information that we are wrong and discard it.  Story about the author biking down a trail and a crotchety old man telling her she's on the wrong trail.  She ignored him and hit a dead end at private property eight miles later.  What caused her to ignore the man didn't have anything to do with whether or not he was right, but instead with who he was.

Physical world can call attention to our errors, but can't force us to acknowledge them.

Sunk costs can irrationally change our decisions.  Similarly, cognitive dissonance can be seen as sunk costs in beliefs.  Beliefs with more sunk costs garner more loyalty to those beliefs, independent of whether or not the belief is true.

How do we acknowledge that we were wrong at all?

  • How often we are exposed to challenges to our beliefs (not "living in a bubble")
  • Whether the people around us make it easy to accept our errors.  The cult members studied by Festinger faced public ridicule when their prophecies failed to come true, which made it harder for them to admit they were wrong.

What works to encourage people to admit error is chaotic and easily perturbed.

Admitting mistakes is a skill which develops in tandem with cognitive development.  Teenagers think they know everything and point out other people's errors.  They have high confidence they are right, and disdain the perceived errors of others.  Admissions of error are seen as "selling out."

In contrast, the wisdom of the elderly stems from the knowledge that nobody knows everything.  In the elderly, cognitive degeneration can lead to becoming set in their ways, adolescent-like once more, and certain of their own rightness.

Ira Gadd quote: "Our capacity to tolerate error depends on our capacity to tolerate emotion."  Mistakes require us to feel something: dismay, foolishness, guilt, etc.

Rigidity allows us to avoid experiencing those emotions and instead be stubborn or defensive or mean.  Irony is that those emotions aren't great either and often create conflict with people.

Our ability to accept error can be cultivated.  The rest of the book aims to help us cultivate this ability.

Chapter 10: How Wrong?

Introduces the "Great Disappointment" (October 22, 1844), when between 25,000 and 1,000,000 people from a cross-section of society gathered to await the Rapture.  William Miller, the leader of the movement, renounced his faith in Christianity as a young man and started believing again during the War of 1812.  Miller saw the Battle of Plattsburgh as evidence of God.  Miller attempted to reconcile the contradictions in Christianity by creating a set of 14 rules, which he then used to conclude that the end of the world was at hand.

Miller met Joshua Himes, who was "Rasputin, Warren Buffett, Karl Rove, and William Randolph Hearst rolled into one: advisor, fundraiser, politician, and PR genius."  Himes launched two newspapers, Signs of the Times and The Midnight Cry, which achieved a circulation of 60,000.  Along with other promotional strategies, Himes turned Millerism into a household word.

A Miller follower, preacher Samuel Snow proposed October 22 and presented calculations to justify it.  Miller was finally convinced two or three weeks prior to October 22 after so many Millerites already believed it so strongly.

Millerites believed so strongly that they didn't plant their fields, gave their money and property away, slaughtered their animals to feed the hungry. Himes shut down his newspapers at the beginning of October.

Washington Morse (a Millerite): "disappointment to the Advent believers . . . can find a parallel only in the sorrow of the disciples at the crucification [sic] of their Lord."

Hiram Edson: "it seemed that the loss of all earthly friends could have been no comparison.  We wept, and wept, till the day dawn."

Luther Boutelle: "unspeakably sad [were] the faithful and longing ones.  Still in the cold world! No deliverance—the Lord not come! . . . All were silent, save to inquire, 'where are we?' and 'what next?'"

They knew that Millerite doctrine was wrong.  But how wrong?

Figuring out where we went wrong and how wrong we are is difficult. Defensiveness is counterproductive for problem solving because if we don't accept mistakes, we can't analyze them. Answer to "how wrong?" matters a lot. 

Thomas Kuhn: Post breakdown of belief system is characterized by explosion of competing hypotheses.  Each is a different answer to "how wrong?"  Each hypothesis determines a possible place we will end up in the future.

Person A can reject the Catholic position on homosexuality and also reject many other Catholic teachings, but Person B can reject the Catholic position on homosexuality while continuing to embrace its other teachings.

Some Millerites denied that they were wrong and believed that Christ had returned to earth and entered the hearts of his followers.  Others parted with teachings of Miller.  Others parted with organized religion.  Others parted with belief in God.

One limit to answer to "how wrong?": denial of wrongness.  Opposite limit: acceptance (of "fractal wrongness" in the limit, a term from Keunwoo Lee: "being wrong at every conceivable scale of resolution")

This chapter focuses on the middle ground between those two.

Donald Norman: "the error correction [process] seems to start at the lowest possible level and slowly works its way higher."  We first suspect that something is wrong with the key, then with the lock, then that we have the wrong car, then that we only imagined having a car in the first place, then that aliens came down and injected superglue in the lock.

Conservatism about the size of our mistakes makes sense sometimes, but other times looks desperate and emotional.

Hiram Edson went on to have a vision that Christ had moved to a second compartment in heaven, and would begin judging.  Doctrine called "Investigative Judgment".  Believed today by 15 million Seventh-Day Adventists in 200 countries.

Some might call Edson's vision confabulation; others might call it simply saving face.

Many people try to save face: "I was wrong, but...":

  • "I was wrong, but wait until next year" (time-frame defense): Millerites waited again on Oct 23, 6pm, Oct 24, Oct 22 1845, Oct 22 1846 . . . Oct 22 1851.  George Bush in 2006: "[I will be vindicated by] the long march of history."  This defense is always available and makes you sound not only right, but visionary, but is often ludicrous.
  • "I was wrong, but only by a little" (near-miss defense): Christ didn't come down to earth, but he had entered the most holy compartment of heaven. "If only X didn't happen, my forecast would have been correct." This boils down to: If I hadn't been wrong, I would have been right.
  • Variant: Out-of-left-field defense: Was on track to being right when unforeseeable event happens.  Problems: any event can be defined as unforeseeable; also assumes that forecast would have come true if event didn't happen.
  • "I was wrong, but it's your fault." I trusted a source or leader that was unreliable or dishonest or deluded.  Millerites weren't coerced or kept in the dark or defrauded.  They placed their faith in an expert who turned out to be wrong.  This is partly reasonable because we must get much of our knowledge from third parties.
  • "better safe than sorry": Better to warn people of what you believe is true rather than regret sharing your belief later. Pascal's Wager.

Although none of the above excuses is completely unfounded, we don't use any of the converse "I was right, but..."s ("however right I look now, I will be proved wrong in the future").

There's a fine line between explaining our mistakes and attempting to justify them.

Chapter 11: Denial and Acceptance

Story of Penny Beerntsen rape on July 29, 1985.

"witness" comes from "wit" (knowledge).  Unwittingly means without knowing.  Eyewitness testimony is heavily relied on in the legal system and is among the oldest form of evidence accepted, only recently starting to losing its throne to DNA evidence.

1902 experiment by Franz von Liszt: Two students argue, one pulls a gun, professor jumps in, shot is fired.  The whole thing is scripted and staged.  Best eyewitnesses got more than 25% of facts wrong, worst got 80% wrong.

Interestingly, "first person" in a literary context doesn't mean reliable, it means non-omniscient, unreliable and subjective.  It's only "one person's story."

Steven Avery was wrongfully convicted of the attack on Penny and sentenced to 32 years in prison.

Glen Woodall was released from prison in 1991 and given a million-dollar settlement by the state.  Despite the evidence that he was wrongly convicted, one of the two victims ran up to the van that was transporting him, weeping and banging on the door, preventing it from being opened.  She went through a terrible ordeal, only to find out that she played a starring role in someone else's terrible ordeal.  It is easier to choose through denial not to face this situation with acceptance.

Freud: Denial is the refusal to recognize the existence or truth of unwelcome facts.  Freud classified denial as a defense mechanism we unconsciously employ to protect ourselves from anxiety or distress.

Elisabeth Kübler-Ross included denial in five stages of grief.

At least 20 percent of seriously ill people who are told they are near death actually forget the news within a few days.

Denial of illness or loss is analogous to denial of error.  With error, it is a defense against the experience of being wrong.

True, sincere, subconscious denial is different from cynical, conscious denial.

Campaigning in 1932, Franklin Roosevelt made a speech in Pittsburgh promising that, if elected, he would balance the budget. A few years later, with the budget far out of balance, President Roosevelt asked his speechwriter, Sam Rosenman, how he was to explain. Rosenman said, "Deny you have ever been in Pittsburgh."

To be in denial is to deceive yourself.  Sartre: to be self-deceived, "I have to know this truth very precisely in order to hide it from myself the more carefully."  You have to know not to surprise your cheating spouse at the office, not to read more of the email that she left open on her computer, and not to ask too many questions about her weekend in order to remain convinced that she is faithful.

How can the mind deceive itself?  It's still a mystery.  Though Plato, Aristotle, Augustine, and Freud propose the idea of semi-warring parts of the self, Sissela Bok points out that a partitioned self is only a metaphor.  How the mind works bears on the moral status of denial: should we be held accountable for refusing to admit that we are wrong?

Peter Neufeld (of the Innocence Project) describes resistance by prosecutors to seeking out the truth: they oppose requests for DNA testing, insist that the test (once done) is flawed, come up with new theories of how the crime was committed, or claim they will retry the case once the person is exonerated and released.

Example: the case of Jimmy Ray Bromgard, convicted of raping an 8-year-old girl.  After DNA testing showed that the semen couldn't have come from Bromgard, McGrath (who had prosecuted the case) suggested Bromgard was a chimera (there have been only thirty cases reported in humans anywhere, ever).  The testimony of Arnold Melnikoff (head of the state crime lab) was found by other forensic scientists to contain egregious misstatements.  Bromgard was freed and sued Montana.  McGrath was deposed and continued to insist that Bromgard was guilty, proposing a lot of absurd theories as to how.  Bromgard's suit was settled for $3.5 million.

Story of Calvin Johnson, who was arrested for rape in 1983.  The prosecutor also came up with new theories once the evidence didn't match Johnson's hair.

Why do prosecutors do this?  Maybe because reputations/careers are on the line.  But maybe because they are in denial to protect themselves from the trauma of having convicted or killed the wrong person.

Penny Beerntsen accepted that she was wrong and wrote a letter to Avery apologizing.

Steven Avery was arrested less than four years after he was released for the 2005 murder of Teresa Halbach, and was convicted in March 2007.  This makes the story more complicated, and Schulz (the author) didn't want to include it in the book. But that's the same attraction to a simple story that might partly motivate prosecutors.

From outside, denying error looks irrational, irresponsible and ugly while admitting it looks like courage, honor, grace.

Chapter 12: Heartbreak

As infants, our survival depends on caretakers grokking our needs.  Later, we learn to grok others.  Irna Gadd: "One of our earliest and most important developmental challenges is to learn to interpret the emotional tone of a moment correctly."  If we misread emotional tone, other person may ignore or correct or be irritated or enraged.  Later, being right about others also feeds into who we choose to believe.  Avishai Margalit: "We exist in a web of witnesses, not a web of beliefs".

Our need to be "gotten" by other people continues into adulthood.  Irna Gadd: "People frequently...want to know: Are you married? . . . have children? . . . divorced? . . . gay? . . . a New Yorker? . . . What they are really asking is: Will you understand me?"

Being wrong about love in literature: Charles Swann, Scarlett O'Hara, Pip, Cécile. (Or reverse direction: When Harry Met Sally, Pride & Prejudice).  Quintessential psychological story.  We can't ever overcome the fundamental separation between our inner world and another person's inner world.

Being wrong about someone else's mind feels different than other kinds of error because:

  • We all have minds and bodies, so we can empathize with each other on that level
  • We can communicate with each other (although this is fallible due to deliberate deceit, lack of self-awareness, and ambiguity: "I'm really stressed out" might mean "ask me how I'm doing" or "leave me alone")
  • We extrapolate from our own experiences

Thomas Nagel's "What Is It Like to Be a Bat?": Just as we cannot comprehend what it feels like to navigate with sonar, we cannot comprehend someone whose experience is very different from ours.  A sighted/hearing person cannot comprehend a person blind/deaf from birth, and vice versa.  Withdrawing the assumption that another person is like us, because we lack understanding of his reality, creates the preconditions for cruelty.  Example: one way to undermine the golden rule is to deny similarity to you: e.g. "Slaves are not intellectually advanced enough to benefit from liberty, and thus don't need it."

We can only ever know others from the outside, and so we assume that we can understand them from the outside, and conversely we assume that we ourselves can only be truly known from the inside.

Psychological studies find that people in shared living situations overestimate the amount of chores/work they contributed to the relationship and underestimate the chores/work done by others.  The effort I put in is real to me, while the effort you put in is only an abstraction in comparison.

Emily Pronin, Justin Kruger, Kenneth Savitsky and Lee Ross study "You Don't Know Me, But I Know You": Participants filled in word fragments like "_ _ N N E R", "B _ _ T", "C H E _ _" and were asked to explain what they thought their responses revealed about their interests, motivations, and disposition.  They were then given responses from another participant and asked what those answers revealed about that person.  When describing themselves, participants claimed that the word completions did not reveal much about their personality.  When describing the other participant, they often inferred detailed traits: "I think this girl is on her period," "He seems to focus on competition," etc.

"We regard others as psychological crystals -- everything visible, and ourselves as icebergs -- everything submerged."

Possibly apocryphal J. M. Coetzee quote: Robinson Crusoe is favorite novel because the story of a man alone on an island is the only story there is.  There is fundamentally a rift between our mind and the rest of the world.  Love is an attempt to bridge that gap and rid ourselves of loneliness of existence.

Plato's Symposium, "origins of love": Humans were each made up of two people.  Zeus split them in half, leaving all descendants to search for their missing counterparts.  The notion of love as a union of souls persists today: "whatever our souls are made of, his and mine are the same," "two hearts living in just one mind," "soul mates," "better halves," "Let me not to the marriage of true minds / Admit impediments."

Also common is the idea that love transcends time.  Harville Hendrix: Falling in love can be distilled into four emotions: a feeling of recognition, a feeling of timelessness, a feeling of reunification ("I no longer feel alone"), and a feeling of necessity ("I can't live without you").  We believe that we will be with our partners, and they with us, forever.

Yet, we fall out of love all the time, so our feeling of love obviously doesn't correspond to reality.  We also speak of love as being blind: "intoxicating," "besotted," "drunk on love," "madness of love" (Socrates), "crazy in love" (Beyoncé).  Being wrong about loving a person is very similar to being wrong in general---comes about for similar reasons, etc.  We form lasting impressions of other people within the first 60 seconds of meeting them and often within 2 seconds.  Thomas Gilovich: particularly true of negative first impressions, since we avoid people we have triaged as people we don't like.

If we let go of the idea of love as transcendent of time and transcendent of the fundamental rift between our mind and the world, we also lose the protection that it purports to offer us against loneliness and despair.  Believing in transcendent love makes the moments between love seem like the exception instead of the rule.  This is similar to how we normally take "rightness" to be the "normal" state and error to be the exception.

Love of God also fulfills this same need, and similarly, feeling like you were wrong about God is devastating due to the "existential chasm" that opens, as opposed to merely the loss of faith.

Raoul Felder, celebrity divorce lawyer: People walk into his office with two predominant emotions: feeling deeply wronged and incredibly righteous.  All blame is put on the other party.  It's like the two sides are talking about different marriages.  Marriage is probably the biggest financial deal you'll ever make, and one of the biggest personal decisions, so when you're used to authority and control, it's a terrible shock to have to accept your own fallibility: that you picked the wrong person.  "I trusted this person, told innermost secrets, and I made a mistake, I'm no judge of people."  People enter denial: "he/she is just feeling/doing this temporarily/midlife crisis, and he/she will come back to me/come to senses."  Or: "you're not the person I fell in love with".

Harville Hendrix, marriage counselor: People falling in love experience a kind of merger of consciousness.  Differences in preferences eventually appear.  Actual differences don't matter, but existence of them does.

When strangers disagree with us, we can ignore/dismiss/denigrate them.  If we do that with loved one, leads to unhappiness.  Better option according to Hendrix is to accept partner's reality alongside our own.  Listen and listen and listen until you finally get it that your partner has their own inner world.  You have to move from reactivity to curiosity.

We conflate wanting to be right (about our own claims/preferences) with wanting to be valued.  Silly squabbles can thus blow up into battles about whether our partner listens/cares/understands.  If we feel loved, we can learn to live with disagreement.  New model of love: Instead of sharing each other's world as we would share a soul, share each other's world as we might share a story.

Chapter 13: Transformation

Story of Claiborne Paul Ellis who went from Exalted Cyclops of the Durham KKK Klavern to willing participant in integration workshops in a couple of weeks.

Other "conversion stories": Saul, Augustine the Manichean.  Commonality: errors that show us that we were wrong about ourselves.  Transformation is being wrong about our identity or about who we thought we were.  Our sense of self is comprised of a bunch of beliefs, which can be in error.  (Thought we didn't want kids, thought we'd grow up to be a doctor, thought we couldn't be happy living in L.A., though we'd never succumb to depression, thought we'd never begin an affair.)  "Our private history...is littered with discarded theories"

Buyer's remorse: failure to accurately predict our desires/beliefs/emotions 3 days into the future.  More general pattern: crave, acquire, regret (milkshake, one-night stand, tattoo, ideology).  Claim: we can also regret choices that we deliberated over at great length.  We don't know ourselves well enough or remain static for long enough to consistently come up with the right answer.

To agree that we can be wrong about ourselves, we must accept the perplexing proposition that there is a gap between what is being represented (our mind) and what is doing the representing (also our mind).

Augustine on imperfect self-knowledge: "The mind is too narrow to contain itself entirely.  But where is that part of it which it does not itself contain?"

We want self-consistency; it makes us feel grounded, confident, safe, sane.  But not always: consider self-help books and conversion stories of redemption.  Yet we are also suspicious of change: "a leopard doesn't change its spots," "you can't teach an old dog new tricks."

Certain beliefs form our grounding sense of identity (I "am conscientious," "have a temper," "am shy," "am good with numbers," "have a short attention span", "am someone my friends can rely on", or "There is a God," "education is important" . . .)

Error challenges both that we know who we are, and also challenges that we are who we are.

Whittaker Chambers: in 1925 he joined the Communist Party and became a Soviet spy for 5 years.  In 1938 he broke with the party and found God.  Testified against Alger Hiss, former friend and alleged fellow spy.  In Witness:

"I cannot say I changed . . . . There tore through me a transformation with the force of a river which, dammed up and diverted for a lifetime, bursts its way back to its true channel.  I became what I was.  I ceased to be what I was not."

"conversion" comes from Latin verb meaning "to return."  Islam converts are sometimes called "reverts."  Also: "coming home," "returning to the flock."

This idea is invoked by "doth protest too much": someone is protesting so stridently that they must secretly believe the opposite of what they claim.

Carl Jung: conscious and unconscious beliefs are in opposition.  The more someone defends a belief, the more internal doubts they have, and the doubts will eventually win.  The more we accommodate counterevidence and doubt, the more stable our beliefs and identities will be.

If there's a "true self" that we're destined to become, then our life has a larger purpose, but it means once we change, we will never change again.  "friends don't let friends be wrong" idea, where people console each other by saying "it was meant to be" or "for the best."  But doign that just reinforces the idea that wrongness is intrinsically negative.  This is the pessimistic model of wrongness.

Alternative: optimistic model of wrongness, where wrongness and change are a natural and ongoing process.  G.W.F. Hegel: "The bud disappears when the blossom breaks through, . . . and we might say that the former is refuted by the latter; in the same way when the fruit comes, the blossom may be explained to be a false form of the plant's existence."  But we don't usually think of the bud and the flower as failures or falsehoods, and we don't need to think that of ourselves either.

Kids are wrong all the time.  Children under four have no idea what brains do, and think that dolls are as likely to have brains as humans.  Children under seven can typically accurately name only three things inside the body: blood, bones, heart, e.g. one child: "lungs are for your hair.  Without them, you couldn't have hair."  Most kids under eight think that boys can become girls and vice versa by changing hair and clothing.

Kids suffer from a shortage of data (sometimes deliberately withheld, e.g. sex/death, or fabricated, e.g. Santa Claus).  Susan Carey: a child "gravely explained that to create a baby, parents first buy a duck, which then turns into a rabbit, which then turns into a baby" (that child had learned it from a book).

Being wrong as a child isn't seen as isolated incidents, but a constant process inextricable from learning/growing up.  Playing with blocks, or with sand is a science experiment.  Error might play the same role for children as it does for scientists: inspiring them to take notice and generate new theories.  Error is one mechanism for learning.

As adults, we don't spend as much time playing/exploring/learning.  Some exceptions:

  1. travel, where we often want to experience the world as new and embrace the possibility of being wrong and of change.
  2. psychotherapy, where we explore ourselves.  Ronnie Janoff-Bulman: "[therapeutic interventions] can all be considered attempts to get the client to question and change old assumptions and construals of reality."  Heinz Hartmann: "a great part of psychoanalysis can be described as a theory of self-deceptions."

Being wrong can make us more compassionate.  To be judgmental, we must feel sure we know right from wrong and that we won't confuse the two.  Experience of error proves that we don't.

During 1988 Democratic Convention, C.P. Ellis: "the cameras zoomed in on one Klansman.  He was saying 'I hope Jesse Jackson gets AIDS and dies.'  I felt sympathy for him.  That's the way my old friends used to think.  Me, too."

Contrast: Chambers: "that communist was not really me."  Vs: Ellis: "that racist was me."

Takes courage to leave past self behind, and more to accept our wrong past self.  Foucault: "the main interest in life and work is to become someone else that you were not in the beginning."

Part 4: Embracing Error

Chapter 14: The Paradox of Error

Institute of Medicine: "690,000--748,000 patients are affected by medical errors in U.S.  44,000--98,000 die."  That makes medical mistakes the eighth leading cause of death, ahead of breast cancer, AIDS, and motor vehicle accidents.

Physicians have an ethos of covering up.

After Harm: "Observing more senior physicians, students learn that their mentors and supervisors believe in, practice and reward the conealment of errors . . . they learn how to talk about unanticipated outcomes until a 'mistake' morphs into a 'complication.' Above all, they learn not to tell the patient anything."

In summer 2008, a surgery at BIDMC (Beth Israel Deaconess Medical Center), one of 175 performed there per week, was done on the wrong side of the patient.  BIDMC had set a goal in January 2008 to eliminate all preventable medical harm by January 1, 2012.  BIDMC emailed its entire hospital staff of 5,000 and sent a press release to local media.

Is error basically eradicable or inevitable?  The practical answer is in between.

Paradox: If you want to eradicate error, you must assume it is inevitable.  If you want to be right, you have to acknowledge your fallibility, seek out your mistakes and analyze them.  This is what is done in transportation, industrial design, food and drug safety, nuclear energy etc.

Six Sigma: measure hard data; look at variance instead of the average.

Commonalities in error prevention strategies:

  1. acceptance of likelihood of error
  2. openness: reporting and broadcasting mistakes (airline industry: crew and ground members encouraged/required to report mistakes, protected from punishment/litigation; open source)
  3. reliance on verifiable data and verifying small seemingly straightforward aspects

These points are necessary because they are counter to our natural tendencies.

Motorola reported saving more than $17 billion from 1986--2006 due to Six Sigma.  After the University of Michigan implemented an apologize-and-explain program, its legal fees dropped to $1 million from $3 million.

Splitting (seeing things as all good or all bad) lasts until about age 5 (exception: borderline personality disorder).  Then we start to learn to express doubt: vocabulary ("maybe", "occasionally", "debatable", "hypothetical"), grammar (conditional, subjunctive).  These expressions of uncertainty can be disarming and can inspire people to listen.

Listening is hard: doctors interrupt patients 18 seconds after the patient starts explaining the reason for the visit.  When we are aware that we could be wrong, we listen more carefully.

John Francis: in 1973, took a 17-year vow of silence.  He found that silence is not just "not talking": he also listened more carefully and had a more open mind because he wasn't formulating an argument against something he disagreed with.

Democracy is an error prevention strategy (has the three parts identified above).  Contrast medieval Europe: Joseph-Marie de Maistre: there is no practical difference between a leader who cannot err and a leader who cannot be accused of erring.

Jean-Jacques Rousseau: although individual rulers are fallible, "the general will cannot err." (direct election)

Thomas Jefferson: "truth is great...has nothing to fear from conflict [against error]...unless disarmed of...natural weapons, free argument and debate." (freedom of speech)

U.S. has tolerance for error built into system: laws can change, separation of powers, split of power between federal and state governments, and acceptance of plurality of political parties.

Richard Hofstadter: U.S. only stabilized when it gave up the dream of becoming a one-party utopia and accepted the existence of political opposition as crucial to maintaining democracy.

Before democracy, political opposition was seen as intrinsically subversive and illegitimate -- something to be suppressed.

A truly open government must recognize that it might stumble into error, and won't know when it has done so.  Must rely on dissidents and whistleblowers, so must permit them to speak out without fear of reprisal.

What about when democratic governments make errors that are later seen as unconscionable?  As in other domains, you can't prevent all mistakes: many hospitals have a "time-out" protocol that surgical teams review before a procedure.  Beth Israel had it, and the patient was marked with a magic marker, but the time-out protocol wasn't used and the surgeon didn't notice.

Descartes, Meditations on First Philosophy: Theoretically there is no limit to how wrong we can be, because an all-powerful deceiver (the "Evil Genius") could deliberately deceive us about everything.  I might not have arms, legs, or free will; there might not be a sky, desk, or chair, because the all-powerful Evil Genius could trick me about all these things.  But even the Evil Genius cannot trick me into thinking that I'm thinking.  If I think I'm thinking, I'm thinking.  Thus, there must be an "I".

Contrast: Charles Peirce: we should not doubt in philosophy what we do not doubt in our hearts.

Author thinks Peirce is wrong: Doubt is act of challenging our beliefs, and we have developed formal methods for that because our hearts are bad at it.

Active investigative doubt inspires us to explore.

In addition to acknowledging mistakes to reduce error rates, there is also an intrinsic reason to embrace error: pleasure (described in the next chapter)

Chapter 15: The Optimistic Meta-Induction from the History of Everything

Kathryn Schulz: in 2007, she accidentally ran to and got into the wrong car at a gas station to get out of the rain, shocking a nursing mother.  It was funny to her sister.

Wrongness can be funny: Candid Camera, linguistic errors ("did you break out in chives?").

Competing theories of humor:

  • Thomas Hobbes: superiority theory of comedy: we laugh when we can look down on others (Bill & Ted's Excellent Adventure, "yo momma" jokes, people who walk into clean glass doors)
  • We laugh at others not out of self-satisfaction, but out of self-recognition.  "Self-improvement theory of humor": Sir Philip Sidney: comedy should serve as an imitation of the common errors of our life; Molière: "the duty of comedy is to correct men by amusing them."
  • We don't laugh at specific mistakes, but instead laugh at the structure of error.  Incongruity theory of humor (Aristotle, Schopenhauer, Kierkegaard, Kant, and Ward Jones are believers in this theory): comedy arises from a mismatch between expectation and actuality.  Attachment to belief, violation of belief (surprise, confusion), replacement of belief.  Same pattern as error.
  • Relief theory: release pent-up tension (dirty jokes, scatological humor, get giggles during stress)
  • Other theories from evolutionary psychology rooted in social hierarchies/habits of play

Bertrand Evans: The Comedy of Errors is not funny by many metrics, but we laugh at the gulf between the characters' understanding and ours (incongruity theory)

Other examples of incongruity theory:

  • Tartuffe, Tootsie (perhaps cross-dressing is funny because it triggers misunderstandings)
  • Discrepancy between what Schulz thought she was doing (entering sister's car) and what she was doing (entering stranger's car).
  • Optical illusions
  • Henri Bergson: A situation is comic when it can be interpreted in two entirely different meanings at the same time.
  • Who's on First
  • puns
  • misheard song lyrics
  • game of "Telephone"
  • "I've had a perfectly wonderful evening. . . but this wasn't it."

Henri Bergson: we only fail to find errors funny when we are invested in the beliefs they destroy.

fundamental gap between us and others gives us loneliness, but also comedy

Art is intrinsically inaccurate.  Plato, The Republic: the ideal civilization would banish all artists because they distort truth.

  • Good: Divine concept of bed
  • Good: Physical instance of bed (necessary corruption of divine concept because you need to sleep)
  • Bad: Picture of a bed (corrupted for no practical reason)

Plato is right that art is a kind of error, but he uses the pessimistic model of wrongness: creating an ideal society by eradicating all forms of error.

Optimistic model: there is legitimacy and potential beauty in individual, skewed, inaccurate representations of reality.  Art is error, which doesn't mean that art is bad, but that error is good.  The capacity to err is one and the same as imagination.  An owl or a lion can pounce at the wrong time and a computer can make an error, but none can imagine things that don't exist.  They can't feel wrong, berate themselves, or feel defensive either.

Western artists previously wanted to represent the world as accurately as possible, but modernism now tries to expose and explore the fact that art is a skewed representation of reality. (Gertrude Stein, Picasso). Tristan Tzara: "Let us try for once not to be right."

Art is intentional journey into error, and artists are unusually aware of this relationship.

John Keats: "and at once it struck me, what quality went to form a Man of Achievement especially in literature & which Shakespeare possessed so enormously---I mean Negative Capability, that is when man is capable of being in uncertainties, Mysteries, doubts without any irritable reaching after fact & reason."

"art" is related to "artificial".

Anne Carson (wrote Essay on What I Think About Most): What we are engaged in when we do poetry, is error.

suspension of disbelief: we enjoy suspense in art, but not in real life.  Examples of intentional deception: The Murder of Roger Ackroyd, Our Mutual Friend, Pride and Prejudice, The Usual Suspects

Usually being lost/confused/in the dark is temporary and is resolved.  In modern art, the feeling is ongoing.

We enjoy believing other people's beliefs in fiction, in contrast with real life.

Art is an exercise in empathy -- we see the world through someone else's eyes (and briefly bridge the fundamental chasm between minds).

Embracing error can do the same thing.

Art and science both engage wrongness: art explores how humans get things wrong, while science tries to come up with a picture of reality that takes the human (perception and feelings) out of the picture.

Only humans have the ability to err.

Our individual mistakes are a fingerprint of our identity.

Schulz claims that error also enables evolution and survival of species

Studies: People who suffer from depression have more accurate worldviews than nondepressed people ("depressive realism").  Ronnie Janoff-Bulman: "a key to the good life might well be illusions at our deepest, most generalized level of assumptions and accuracy at the most specific, least abstract levels."  Don Quixote is happy when he has an inaccurate worldview.  Michel Foucault: Quixote represents an extreme version of all of us: many of us think we are above average, better looking, and younger than we are, and that our loved ones are lovelier than they are.

Schulz talks about aiming higher than realistic as a form of being wrong, and thus wrongness as inspiring us to undertake things.
