5 Psychological Flaws That Warp the Way You See the World
Whoops. You didn't mean to click on this link, did you? You wanted to read the article full of badass mind-blowing facts about badminded ass-blowers, didn't you? So why did you make that mistake? Come to think of it, why do you make mistakes in general? You're a grown-up, with tons of experience getting through the day without shitting your pants. And yet you (and all of us) still make mistakes, even when doing really familiar things.
Like keeping our pants unsoiled.
It turns out that there are some very common patterns that occur when we make mistakes, well-known cognitive speed bumps that our brains can't help but trip over.
Here are five of them.
You See Less Than You Think
Do you have any idea how your eye works? You probably think it's like a camera, right? Light comes in, bounces around on lenses and mirrors and such, and then hey, presto, an image appears in your brain, just like a security camera.
Or a toilet cam.
But it's not like that at all, or rather, it's like that if your eye is the shittiest video camera in the world. Because while a video camera can keep a pretty broad field of view entirely in focus, your eye can't. Only a small fraction of what you see is in focus at any given time -- the light that's cast on the small part of the retina called the fovea. Everything else is a blurry mess, with little way to differentiate between a bear and a particularly ugly couch.
Or other things.
Yet despite that, we all still have a pretty good mental image of what's happening in front of us at all times. How? Basically, the eye focuses on what it needs to, and our brains fill in the blanks on the rest. More often than not, this prediction is accurate, based on things the eye has recently seen. But sometimes it isn't. Sometimes your brain messes this up pretty badly. This is essentially how most optical illusions work: our brain tries to build its own picture of a scene based on its longstanding rules about how the world works, only to utterly cock up its job of accurately displaying what's in front of us.
And that's just at the simple low level of actually seeing things. At a higher level, we make even weirder mistakes. This video provides a pretty tidy demonstration.
For the video-impaired, the deal is that a researcher approaches a subject on a campus and asks for directions. "Where them college girls at?" let's assume. As the subject is giving directions to Sorority Row, two more people approach carrying a door and step between the two, at which point the researcher swaps places with one of the people carrying the door. The subject continues giving directions to the new person without noticing the switch. The video claims that about 50 percent of the subjects they tried this on didn't notice a thing.
How did that happen? The best theory seems to be that once the subject realizes they're dealing with a person they'll never see again, they stop paying attention to their face. The subject mentally files this person into a bucket called "Dude looking for some college tail," and that bucket doesn't need any facial features, because the subject's never going to deal with them again. Their eyes might have seen it, but the brain didn't remember it, because it assumed it didn't have to. It's this phenomenon, among other things, that's responsible for the frighteningly low detection rate for spousal haircuts.
She's basically just a noisy blur that keeps reminding him of chores.
You're Full of Hidden Biases
Now, by bias, I don't mean how you assume that all women love fried chicken, or that every Asian person likes going to the bathroom in groups.
Or that you think all black people are deadly martial artists.
I'm more talking about cognitive biases, which are well-known and studied mistakes that people make when thinking. We've talked about these before, and you'll of course be familiar with confirmation bias if you've read anything to do with politics, ever. But there's so much more than that, like these couple hundred named, researched cognitive biases. Like, let's say the gambler's fallacy, the tendency we have to think that past events can influence future (independent) events. Like flipping a coin and getting five heads in a row, and thus thinking tails is "due."
"Five thousand on tails!"
Or the hindsight bias, which is the tendency we have to view past (utterly random) events as if they were predictable.
"Black. Of course it was going to be black. It's always black when I bet tails."
"Sir, your bet?"
"Five thousand on redheads!"
Also, the identifiable victim effect, which causes us to respond more strongly to a crime when it has a single, identifiable victim instead of a larger, faceless group.
"Reports indicate that the victim was badly beaten by a cocktail waitress who had mistakenly assumed he was soliciting sex from her."
Basically, no matter what job you give your brain to do, as soon as you turn your back, that gold-bricking asshole immediately starts taking shortcuts.
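If you don't believe that the coin genuinely doesn't care about its own streak, you can check. Here's a quick sketch (purely illustrative, in Python): flip a virtual coin a million times, find every run of five heads, and see what came next. The gambler's fallacy says tails should be "due"; independence says it's still a 50/50 toss.

```python
import random

random.seed(1)  # fixed seed so the run is repeatable
flips = [random.choice("HT") for _ in range(1_000_000)]

# After every streak of five heads in a row, record the very next flip.
after_streak = [flips[i + 5] for i in range(len(flips) - 5)
                if flips[i:i + 5] == list("HHHHH")]

tails_rate = after_streak.count("T") / len(after_streak)
print(f"Tails after five heads: {tails_rate:.3f}")  # hovers around 0.5, not 1.0
```

However many streaks the simulation finds (roughly one per 32 flips), tails shows up next about half the time. The coin has no memory, and neither does the roulette wheel.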
You Keep Building Oversimplified Models
You "know" how the real world works. We all essentially have a little model of the entire universe spinning around in our heads. And sure, it's not complete, because we haven't touched or stroked the entire universe yet, but the parts that we have seen are pretty solid.
Well, let's examine a small, moist part of that universe and venture into the fetid swamp of video game forums. On just about every video game forum or blog, you will quickly find someone arguing essentially this:
"Sony/Microsoft really shat the bed this time with their latest move with the PlayStation/Xbox. Look! Their stock price lost XX points today!"
This makes sense if you know a little bit about the stock market, like the fact that there is such a thing as a stock market and that people in shirts work there. Companies that do well have their stock prices go up, and vice versa. Easy, right?
Everyone in this picture is a simpleton.
But if you happen to know just a leeetle bit more about the stock market, you'll realize that no, that argument doesn't make a bit of sense. Sony and Microsoft are huge companies, and their respective video game divisions make up a tiny fraction of each company's revenue. The stock market honestly doesn't care much at all what happens with the Xbox, so long as Microsoft Office sales are still fine.
Let's go back to that little model of the universe we all have. The problem is, for the parts of the universe we haven't learned about yet, there isn't a blank spot there, with maybe a little note that says "read a fucking book, moron." No, it's worse than that. What we'll find there is something that looks exactly like the rest of the model, except it's based on our brain's wild-ass guessing.
We oversimplify when dealing with people, too. The fundamental attribution error occurs when we decide that someone's behavior is driven by their personality, rather than external factors. A co-worker didn't return your calls not because he was swamped with other work, he did it because he hates you. The cashier at the fast food restaurant isn't moving slowly because it's her first day. No, it's because her parents were cousins.
He didn't cut you off because he couldn't see you and you were speeding yourself. He cut you off because he's a cat fucker.
You Don't Learn from Your Mistakes
Sometimes we make mistakes.
But that's cool. After all, we have to fail to get better, right? We can all think of times in our lives when we were new and sucky at something and kept screwing up, but then slowly but surely got better at it. Inspirational music may have been playing. There may have been some splashing around on the beach in some tiny clothes, too.
Regardless of how much homoerotic clutching is in your personal success story, the lesson was that it was the failures that made us better.
Except no, no, no they didn't. We learn far more from success than we do from failure. MIT scientists hooked up something to a monkey's brain, a laser probably, and then watched as the monkey tried various activities successfully and unsuccessfully. They found that every success registered brain activity that influenced later attempts in a way that failures didn't; the failures barely registered at all. Doing something right made it far easier for the monkey to do it right again in the future.
So what then to make of the old advice? Well, I'd suggest it might better be rephrased "We learn by trying again." (Assuming you have the common sense to not fail the same way twice.)
Success! No, wait ...
At a higher level, we can look at another one of those pesky cognitive biases, in this case the choice-supportive bias. This is what happens when you make a choice and then immediately seek out justifications that you made the right choice. Evidence that supports the choice is promoted over evidence that suggests otherwise. (If you've ever met anyone who bought a Mac, you'll have seen this one in action.) What makes this so insidious is that in cases where the choice was actually and objectively a mistake, it's incredibly difficult to figure that out, because your own damned brain will fight like hell to prevent you from seeing it.
If you've ever met anyone who installed Linux, you'll have seen this, too.
You're Ridiculously Overconfident
And yet, despite all of the preposterous shortcuts our brains take, usually without telling us, our brains don't seem to lack for confidence. If anything, we have an excess of it. For example, when people say they're 100 percent certain of something, there's only about an 80 percent chance they're right. Ninety-three percent of us seem to think we're above-average drivers. And apparently 84 percent of all Frenchmen consider themselves above-average lovers.
Which is statistically unlikely, but scientists have been unable to attach enough lasers to French women's vaginas to be sure.
We all seem to think we're above average at everything we do, and bizarrely, we grow even more confident in our abilities the less we know about something. As Charles Darwin, a man made famous for killing God, and certainly no stranger to confidence, once stated:
"Ignorance more frequently begets confidence than does knowledge."
More formally, this is called the Dunning-Kruger effect. There are a few possible causes for this, but basically, if you're doing something that you're totally incompetent at, you might be so bad at it that you don't even know you're doing it wrong. The decision-making processes you use to answer a question are the same processes that you'd use to evaluate whether it's right or not. We don't just not see our blind spots, we don't know they exist. It's yet another way we don't learn from our mistakes.
But even the ultra-competent can be prone to overconfidence. Consider the case of surgeons, who are by most measures above average at most of the things they do.
But not, sadly, at lovemaking.
But surgeons are still human and inevitably make mistakes. Instruments get left in patients, surgeries get started without the right supplies, leg bones get attached to the neck bone. It's a serious problem, and people have, haha, actually died because of it.
The instruments left in patients thing. Not this. That was actually a joke.
A really easy way of minimizing these avoidable errors is by using checklists. Is the patient here? Check. Are there enough leeches? Check. That kind of thing. Basically, pages and pages of utterly obvious bookkeeping, the kind of stuff any expert already knows how to do. Boring though they may be, the airline industry uses checklists for everything, and it's one of the reasons why airlines have such preposterously good safety records.
And yet when a surgeon tried to implement checklists in various hospitals, he encountered significant resistance among the surgeons he suggested it to. They considered checklists pedantic and insulting, a waste of their time, full of things they already knew how to do. The very idea that they could forget the 18th step in a 57-step surgery, a pretty plausible-sounding error that happens all the time, was unthinkable to them.
"I did not go to school for eight years to be treated like I was fallible. NOW KISS MY GOD DAMNED RING."
And so there you have it. Dumb or smart, you're basically doomed to fuck up at everything you do. The only solution is apparently to use a checklist, which is boring and insulting and something you're not gonna want to do.
Hell, for all I know, you'll probably just eat the fucking thing.
Chris Bucholz is a Cracked columnist and your best friend. Join him on Facebook or Twitter and make him reconsider that.