Excerpts from Chapter VIII (Inner Demons: The Moralization Gap and The Myth of Pure Evil) from The Better Angels of Our Nature by Steven Pinker

I [have] argued that the modern denial of the dark side of human nature — the doctrine of the Noble Savage — was a reaction against the romantic militarism, hydraulic theories of aggression, and glorification of struggle and strife that had been popular in the late 19th and early 20th centuries. Scientists and scholars who question the modern doctrine have been accused of justifying violence and have been subjected to vilification, blood libel, and physical assault. The Noble Savage myth appears to be another instance of an antiviolence movement leaving a cultural legacy of propriety and taboo.

…[In his book, Evil, Roy] Baumeister was moved to study the commonsense understanding of evil when he noticed that the people who perpetrate destructive acts, from everyday peccadilloes to serial murders and genocides, never think they are doing anything wrong. How can there be so much evil in the world with so few evil people doing it?

…Baumeister and his collaborators Arlene Stillwell and Sara Wotman couldn’t very well get people to commit atrocities in the lab, but they reasoned that everyday life has its share of smaller hurts that they could put under the microscope. They asked people to describe one incident in which someone angered them, and one incident in which they angered someone. The order of the two questions was randomly flipped from one participant to the next, and they were separated by a busywork task so the participants wouldn’t answer them in quick succession. Most people get angry at least once a week, and nearly everyone gets angry at least once a month, so there was no shortage of material. Both perpetrators and victims recounted plenty of lies, broken promises, violated rules and obligations, betrayed secrets, unfair acts, and conflicts over money.

But that was all that the perpetrators and victims agreed on. The psychologists pored over the narratives and coded features such as the time span of the events, the culpability of each side, the perpetrator’s motive, and the aftermath of the harm. If one were to weave composite narratives out of their tallies, they might look something like this:

The Perpetrator’s Narrative: The story begins with the harmful act. At the time I had good reasons for doing it. Perhaps I was responding to an immediate provocation. Or I was just reacting to the situation in a way that any reasonable person would. I had a perfect right to do what I did, and it’s unfair to blame me for it. The harm was minor, and easily repaired, and I apologized. It’s time to get over it, put it behind us, let bygones be bygones.
The Victim’s Narrative: The story begins long before the harmful act, which was just the latest incident in a long history of mistreatment. The perpetrator’s actions were incoherent, senseless, incomprehensible. Either that or he was an abnormal sadist, motivated only by a desire to see me suffer, though I was completely innocent. The harm he did is grievous and irreparable, with effects that will last forever. None of us should ever forget it.

They can’t both be right — or more to the point, neither of them can be right all of the time, since the same participants provided a story in which they were the victim and a story in which they were the perpetrator. Something in human psychology distorts our interpretation and memory of harmful events.

This raises an obvious question. Does our inner perpetrator whitewash our crimes in a campaign to exonerate ourselves? Or does our inner victim nurse our grievances in a campaign to claim the world’s sympathy? Since the psychologists were not flies on the wall at the time of the actual incidents, they had no way of knowing whose retrospective accounts should be trusted.

In an ingenious follow-up, Stillwell and Baumeister controlled the event by writing an ambiguous story in which one college roommate offers to help another with some coursework but reneges for a number of reasons, which leads the student to receive a low grade for the course, change his or her major, and switch to another university. The participants (students themselves) simply had to read the story and then retell it as accurately as possible in the first person, half of them taking the perspective of the perpetrator and half the perspective of the victim. A third group was asked to retell the story in the third person; the details they provided or omitted serve as a baseline for ordinary distortions of human memory that are unaffected by self-serving biases. The psychologists coded the narratives for missing or embellished details that would make either the perpetrator or the victim look better.

The answer to the question “Whom should we believe?” turned out to be: neither. Compared to the benchmark of the story itself, and to the recall of the disinterested third-person narrators, both victims and perpetrators distorted the stories to the same extent but in opposite directions, each omitting or embellishing details in a way that made the actions of their character look more reasonable and the other’s less reasonable. Remarkably, nothing was at stake in the exercise. Not only had the participants not taken part in the events, but they were not asked to sympathize with the character or to justify anyone’s behavior, just to read and remember the story from a first-person perspective. That was all it took to recruit their cognitive processes to the cause of self-serving propaganda.

…The Moralization Gap is a part of a larger phenomenon called self-serving biases. People try to look good. “Good” can mean effective, potent, desirable, and competent, or it can mean virtuous, honest, generous, and altruistic. The drive to present the self in a positive light was one of the major findings of 20th-century social psychology…Among the signature phenomena are cognitive dissonance, in which people change their evaluation of something they have been manipulated into doing to preserve the impression that they are in control of their actions, and the Lake Wobegon Effect (named after Garrison Keillor’s fictitious town in which all the children are above average), in which a majority of people rate themselves above average in every desirable talent or trait.

…The problem with trying to convey an exaggerated impression of kindness and skill is that other people are bound to develop the ability to see through it, setting in motion a psychological arms race between better liars and better lie detection…[Robert] Trivers ventured that natural selection may have favored a degree of self-deception…[In other words,] we lie to ourselves so that we’re more believable when we lie to others. At the same time, an unconscious part of the mind registers the truth about our abilities so that we don’t get too far out of touch with reality. Trivers credits George Orwell with an earlier formulation of the idea: “The secret of rulership is to combine a belief in one’s own infallibility with a power to learn from past mistakes.”

Self-deception is an exotic theory, because it makes the paradoxical claim that something called “the self” can be both deceiver and deceived. It’s easy enough to show that people are liable to self-serving biases, like a butcher’s scale that has been miscalibrated in the butcher’s favor. But it’s not so easy to show that people are liable to self-deception, the psychological equivalent of the dual books kept by shady businesses, in which a public ledger is made available to prying eyes and a private ledger with the correct information is used to run the business.

A pair of social psychologists, Piercarlo Valdesolo and David DeSteno, have devised an ingenious experiment that catches people in the act of true, dual-book self-deception. They asked the participants to cooperate with them in planning and evaluating a study in which half of them would get a pleasant and easy task, namely looking through photographs for ten minutes, and half would get a tedious and difficult one, namely solving math problems for forty-five minutes. They told the participants that they were being run in pairs, but that the experimenters had not yet settled on the best way to decide who got which task. So they allowed each participant to choose one of two methods to decide who would get the pleasant task and who would get the unpleasant one. The participants could just choose the easy task for themselves, or they could use a random number generator to decide who got which. Human selfishness being what it is, almost everyone kept the pleasant task for themselves. Later they were given an anonymous questionnaire to evaluate the experiment, which unobtrusively slipped in a question about whether the participants thought that their decision had been fair. Human hypocrisy being what it is, most of them said it was. Then the experimenters described the selfish choice to another group of participants and asked them how fairly the selfish subject had acted. Not surprisingly, they didn’t think it was fair at all. The difference between the way people judge other people’s behavior and the way they judge their own behavior is a classic instance of a self-serving bias.

But now comes the key question. Did the self-servers really, deep down, believe that they were acting fairly? Or did the conscious spin doctor in their brains just say that, while the unconscious reality-checker registered the truth? To find out, the psychologists tied up the conscious mind by forcing a group of participants to keep seven digits in memory while they evaluated the experiment, including the judgment about whether they (or others) had acted fairly. With the conscious mind distracted, the terrible truth came out: the participants judged themselves as harshly as they judged other people. This vindicates Trivers’s theory that the truth was in there all along.

…Though acknowledging a compromising truth about ourselves is among our most painful experiences…it is, at least in principle, possible. It may take ridicule, it may take argument, it may take time, it may take being distracted, but people have the means to recognize that they are not always in the right. Still, we shouldn’t deceive ourselves about self-deception. In the absence of these puncturings, the overwhelming tendency is for people to misjudge the harmful acts they have perpetrated or experienced.
