A quick summary. Mistakes Were Made (But Not By Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts is about cognitive dissonance: the uncomfortable-at-best feeling you get when things you do, or things that happen, contradict your beliefs -- about yourself or the world. It's about the unconscious justifications, rationalizations, and other defense mechanisms we use to keep that dissonance at bay. It's about the ways that these rationalizations perpetuate and entrench themselves. And it's about some of the ways we may be able to derail them. The book is fascinating and readable; it's clear, well-written, well-researched, loaded with examples, and often very funny.

"I couldn't help it." "Everyone else does it." "It's not that big a deal." "I was tired/sick." "They made me do it." "I'm sure it'll work out in the long run." "I work hard, I deserve this." "History will prove me right." "I can accept money and gifts and still be impartial." "Actually, spending fifty thousand dollars on a car makes a lot of sense." "When the Leader said the world was going to end on August 22, 1997, he was just speaking metaphorically."
In fact, we have entire social structures based on supporting and perpetuating each other's rationalizations -- from patriotic fervor in wartime to religion and religious apologetics.
I could summarize the book ad nauseam, and this could easily turn into a 5,000-word book review. But I do have my own actual points to make. So here are, IMO, the most important pieces of info to take from this book:
1) This process is unconscious. It's incredibly easy to see when someone else is rationalizing a bad decision. It's incredibly difficult to see when we're doing it ourselves. The whole way that this process works hinges on it being unconscious -- if we were conscious of it, it wouldn't work.
2) This process is universal. All human beings do it. In fact, all human beings do it pretty much every day. Every time we take a pen from work and think, "Oh everyone does it, and the company can afford it"; every time we light a cigarette after deciding to quit and think, "Well, I only smoke half a pack a day, that's not going to kill me"; every time we eat a pint of Ben and Jerry's for dinner and think, "It's been a long week, I deserve this"; every time we buy consumer products made in China (i.e., by slave labor) and think, "I really need new sneakers, and I just can't afford to buy union-made"... that's rationalization in action. It is a basic part of human mental functioning. If you think you're immune... I'm sorry to break this to you, but you're mistaken. (See #1 above, re: this process being unconscious, and very hard to detect when we're in the middle of it.)

3) This process is self-perpetuating.
This is probably the scariest part of the book. When we hurt someone and convince ourselves that they deserved it, we're more likely to hurt them -- or other people like them -- again. Partly because we've already convinced ourselves that they're bad, so why not... but also, in large part, to bolster our belief that our original decision was right.
The most chilling examples of this are in the justice system and international relations. In the justice system, cops and prosecutors are powerfully resistant to the idea that they might have made a mistake and put the wrong person in prison. As a result, they actively resist revisiting cases, even when new evidence turns up. And the justice system is, in far too many ways, structured to support this pattern.
As for this process playing out in international relations, I have just three words: "The Middle East." Any time you have a decades- or centuries-old "they started it" vendetta, you probably have one of these self-perpetuating rationalization processes on your hands. On all sides.
But this happens on a small scale as well, with individuals. I know that I've said snarky, mean things behind people's backs, for no good reason other than that friends of mine didn't like them and were being mean and snarky about them... and I've then convinced myself that I really couldn't stand that person, and gone on to say even more mean things about them. And I've more than once tried to convince my friends to dislike the people that I disliked... because if my friends liked them, it was harder to convince myself that my dislike was objectively right and true. All unconsciously, of course. It's taken time and perspective to see that that's what I was doing.
4) The more we have at stake in a decision, the harder we hang on to our rationalization for it.
This is a freaky paradox, but it makes a terrible kind of sense when you think about it. The further along we've gone with a bad decision, and the more we've committed to it, the more likely we are to justify it -- and to stick with it, and to invest in it even more heavily.
A perfect example of this is end-of-the-world cults. When people quit their jobs and sell their houses to follow some millennial leader, they're more likely to hang on to their beliefs, even though the world conspicuously did not end on August 22, 1997 like they thought it would. If someone doesn't sell their house to prepare for the end of the world -- if, say, they just take a week off work -- they'll find it easier to admit that they made a mistake.
And this is true, not just for bad decisions and mistaken beliefs, but immoral acts as well. Paradoxically, the worse the thing is that you've done, the more likely you are to rationalize it, and to stick to your rationalization like glue. As I wrote before when I mentioned this book: It's relatively easy to reconcile your belief that you're a good person with the fact that you sometimes make needlessly catty remarks and forget your friends' birthdays. It's a lot harder to reconcile your belief that you're a good person with the fact that you carved up a pregnant woman and smeared her blood on the front door. The more appalling your immoral act was, the more likely you are to have a rock-solid justification for it... or a justification that you think is rock-solid, even if everyone around you thinks it's transparently self-serving or batshit loony.
5) This process is necessary.
This may be the hardest part of all this to grasp. As soon as you start learning about the unconscious rationalization of cognitive dissonance, you start wanting to take an icepick and dig out the part of your brain that's responsible for it.
But in fact, rationalization exists for a reason. It enables us to make decisions without being paralyzed by every possible consequence. It enables us to have confidence and self-esteem, even though we've made mistakes in the past. And it enables us to live with ourselves. Without it, we'd be paralyzed with guilt and shame and self-doubt. Perpetually. We'd never sleep. We'd be second-guessing everything we do. We'd be having dark nights of the soul every night of our lives.
So that's the gist of the book. Cognitive dissonance, and the unconscious rationalizations and justifications we come up with to deal with it, are a basic part of human consciousness. It's a necessary process... but it also does harm, sometimes great harm. So we need to come up with ways, both individually and institutionally, to minimize the harm that it does. And since the process is harder to stop the farther along it's gone, we need to find ways to catch it early.
It's important because, in a very practical and down-to-earth way, this concept gives us a partial handle on why dumb mistakes, absurd beliefs, and harmful acts get perpetuated. And it gives us -- again, in a very practical, down-to-earth way -- a handle on what we can do about it.
Rest here: http://gretachristina.typepad.com/greta_christinas_weblog/2008/01/mistakes-were-1.html