Everybody will tell you that memory can't be trusted. When they say that, of course, what they mean is other people's memories can't be trusted. We don't like to think that everything we know about the world is based on a deeply flawed and illogical storage system.
We're not talking about being bad at matching faces with names here. Science has found that your memory is basically a pathological liar, just making it up as it goes along. For instance ...
5. Other People Can Manipulate Your Memory With Repetition
There was quite a stir recently when it turned out that a growing number of people believe the President of the USA is a Muslim. Regardless of whether or not you intend to vote for the man, this is just an issue of fact, and the fact is that at various times we have all seen video clips of Mr. Obama drinking alcohol, eating pork, getting sworn in on a Christian Bible and sitting in a Christian church.
But according to the Pew Research Center, for almost 20% of the people they polled, those memories have been trumped by the mere act of hearing commentators assert that Obama is a Muslim, over and over and over.
Obama, posing with a statue of the famed Imam Ali bin Superman.
You can laugh at them all you want, but that technique works on all of us, to various degrees. Nobody likes to think of themselves as susceptible to advertisements, or propaganda, or liars. Too bad. It's just part of the mechanical workings of our brain: when we hear a statement enough, we'll start to believe it.
They call it the "Illusion of Truth" effect. We judge things to be true based on how often we hear them. We like familiarity, and repeating a lie often enough makes it familiar to us, the repetition making it fall right in with all of the things our memory tells us are true about the world. Every advertiser or propagandist knows this. Humans are social animals, and there is a primal part of us that still says, "If other members of the tribe who I feel close to believe this, there must be something to it."
"We will never regret any of these decisions."
And no, simply showing us the correct information doesn't fix it. Quite the opposite: research shows that once we've seized on an incorrect piece of information, exposure to the facts either doesn't change what we think, or makes us even more likely to cling to the false information. You can guess why this is: our self-image triumphs over all. It's more important that we continue to think of ourselves as infallible than admit we're wrong. This is how people go on believing hoaxes even after the hoaxers themselves have admitted they were fake.
"Who would fake something like that?"
But wait, here's the best part:
Most of you will still think of this as something other people do, and that you of course are the unbiased observer who can clearly see their stupidity. There is a reason for this, too. They call it the Bias Blind Spot. The biases in your system cripple even your ability to examine your own biases. So just now, when you thought to yourself, "Ha, I've caught myself doing that! But at least I'm not as nutty as those 'Obama is a Muslim' nutjobs!", you just saw your own bias at work. You're trying to examine a broken mechanism with a broken mechanism. It's like trying to perform surgery on your own ass, with a scalpel that is itself clenched in your ass.
"So we're out of gloves..."
4. Your Brain Is Half-Blind
Most people seem to think of the brain as an incredibly complex machine that can do amazing things, but, at least when it comes to processing visual information, your brain is actually quite lazy, filling in what you are seeing with generic information it figures is probably there. This half-assed method of construction is known, in technical terms, as the Teamster approach. The best and most ridiculous example of this comes from the Invisible Gorilla study:
In the study, volunteers were asked to watch the above video of two basketball teams and count how many passes there were. Try it.
During the video, a person in a gorilla costume walks across the court. Half the people who watch that video don't notice the gorilla. All of them saw it, but they didn't know they had seen it. When they watched the tape again after being told there was a gorilla, they all saw it, but still had no recollection of having seen it before. Because we are told to focus on the ball, our brain immediately makes assumptions about everything else in the scene and lazily fills it in (in this case, it assumes an empty, gorilla-free court), whether that's accurate or not.
Likewise, when you walk into an office, you will notice the hot receptionist, but you won't notice what her phone looks like, what color her chair is, or the fact that she has twenty glass cat figurines displayed on her desk. You saw all of that, in the sense that the light reflecting off all of those objects hit your eye, but without focusing on it you won't actually remember any of it. If pressed to remember it later, you'll just fill in generic images.
"...there was a pretty neat lamp in the corner?"
What is surprising about the above experiment is that even when those unnoticed details contain something unexpected, striking or even shocking (such as a rogue gorilla), your brain still just smooths right over it. "Nothing to see here!"
So take a moment and wonder how many of your life's most striking or world-changing sights have fallen into this black hole of inattention.
"Holy shit, someone dropped a quarter!"