Rationality: an analogy
Almost everyone thinks they're a good driver. Ask them, they'll tell you. Here's a survey which did just that. Most Americans (and, I presume, most people in other countries as well) believe themselves to be "excellent" or "very good" drivers.
But if you start asking specific questions, you'll get a different picture:
- Do I speed?
- Do I tailgate? (i.e. Do I fail to keep the two-second following distance?)
- Do I run red lights? (i.e. Am I in the intersection when my light is red?)
- Do I "roll through" stop signs?
- Do I text (or Tweet, or Facebook, or email, or Google, or …) or use my phone while driving?
- Do I drive while excessively tired or sleepy?
- Do I drive drunk, high, or "buzzed"?
- Do I let myself get distracted (looking for CDs, eating, messing with the radio, etc.)?
- Do I properly use seat belts and child seats?
I'm sure there are many more I could add. The answers to these questions tell a rather different story about many of those who rate themselves as "good" drivers. If I do any of these, I'm not being a very good driver (at least while I'm doing them).
Oh, and most people who rate themselves highly also rate other drivers as "average" or "poor". Of course, actual good drivers can honestly rate themselves highly and honestly rate others comparatively poorly, but obviously there are a lot of bad drivers who do it too.
The problem is that many people define "bad driver" as "gets in my way" and/or "I see them doing something dumb". By that definition, they can't rate themselves as bad drivers, even if they have plenty of dangerous driving habits themselves.
Thanks to certain biases, people tend to make exceptions for themselves that they don't make for others. They understand their own motivations, but not those of other people. So when I run the light, it's because I'm running late and had already been waiting a long time; when someone else runs that same light in front of me, it's because they're an idiot or a bad person. When I miss something obvious and narrowly avoid hitting the car in front, it was just a momentary distraction, a one-time thing, and I managed to stop in time, so no big deal. When the car behind me almost crashes into me at a stop light, that driver is simply bad, plain and simple. What I was doing wasn't really dangerous… and so on.
I don't exclude myself: I do try to be extremely conscious of avoiding these driving sins, but I often find myself distracted and end up jerking the wheel back as I approach the side of the road, or I'll be slow at noticing something I really should have noticed sooner (brake lights in front of me, etc). Occasionally I'll answer the phone if it's my wife calling. And probably some other things I wrongly think aren't important or things I don't even notice because I was not paying attention.
Similarly, almost everyone thinks they're rational. Ask them, they'll tell you. They have good reasons for their beliefs (unlike those other people). They diligently educate and inform themselves (unlike those other people). They would certainly never believe things due to failures of rationality (unlike those other people).
But again, if you start examining some specifics, you get a different picture. Evaluating specific failures in rationality is, of course, a much more difficult task than evaluating specific bad driving habits. Reflecting on why I make certain decisions is subject to the same biases and rationality failures as making the decision itself. But there are a few things you can think about.
- Do I repeat weak arguments for my beliefs?
- Do I critically analyze arguments against my beliefs while giving arguments for them a free pass?
- Do I pay attention only to evidence which confirms my beliefs, ignoring any that doesn't?
- Do I misunderstand or misuse statistics?
- Am I often certain of things which later turn out to be false?
- Do I confuse mere correlation with causation?
- Do I rely on anecdotes, even when real scientific work probably has better answers?
I could probably add dozens more, or hundreds if we started naming specific examples like:
- Do I worry more about extremely small (but scary) risks (kidnapping, school shootings, terrorism, vaccine injury) than much more common risks (heart disease, cancer, stroke, accidents, driving)?
If we brought in politics, this list would be too long to print.
Really, there are whole lists of documented and studied cognitive biases that affect our behavior, decisions, and beliefs.
Just like with driving, most people think they are rational. After all, we have good reasons for our actions and beliefs. Other people? Not so much. Everyone seems to implicitly define "rational" as "agrees with me", so of course other people who have reached different conclusions aren't being rational.
Due to things like bias blind spot, we tend to rationalize our own beliefs and we think our ideas have a much better rational basis than they do. Then the fundamental attribution error causes us to ignore the possible rational reasons other people believe what they believe. Just like with driving: I know what I'm doing when I roll through a stop sign; that other guy who went through a sign and almost hit me was just being dangerous.
Once more, I don't exclude myself. I often find myself making important decisions based on less-than-rational arguments. I find myself repeating dumb arguments (especially in the political world) and dismissing possible lines of thought because they came from the "wrong" source (especially in the political world), or I didn't like the conclusion. I find myself looking for arguments to reach my predetermined decision: rationalizations.
I am trying. I'm trying not to repeat bad arguments, even if they support my side. I'm trying not to let anecdotes (unduly) influence my choices when there's better evidence available, even if I don't like it as much. I'm trying not to make assumptions about people based on limited data. I'm trying not to be so certain of my previous conclusions. And so on. But I fail at all of this quite a bit, and probably a lot more than I know.
I guess what I'm saying is that irrationality is a problem we all have: our brains mess up all the time. It's easy to point out the problems in other people's beliefs (and even easier to invent problems in other people's beliefs), but we put very little effort into debunking our own.