Everybody will tell you that memory can't be trusted. When they say that, of course, what they mean is other people's memories can't be trusted. We don't like to think that everything we know about the world is based on a deeply flawed and illogical storage system.
We're not talking about being bad at matching faces with names here. Science has found that your memory is basically a pathological liar, just making it up as it goes along. For instance ...
#5. Other People Can Manipulate Your Memory With Repetition
There was quite a stir recently when it turned out that a growing number of people believe the President of the USA is a Muslim. Regardless of whether or not you intend to vote for the man, this is just an issue of fact, and the fact is that at various times we have all seen video clips of Mr. Obama drinking alcohol, eating pork, getting sworn in on a Christian Bible and sitting in a Christian church.
But according to the Pew Research Center, for almost 20% of the people they polled, those memories have been trumped by the mere act of hearing commentators assert that Obama is a Muslim, over and over and over.
Obama, posing with a statue of the famed Imam Ali bin Superman.
You can laugh at them all you want, but that technique works on all of us, to various degrees. Nobody likes to think of themselves as susceptible to advertisements, or propaganda, or liars. Too bad. It's just part of the mechanical workings of our brain: when we hear a statement enough, we'll start to believe it.
They call it the "Illusion of Truth" effect. We judge things to be true based on how often we hear them. We like familiarity, and repeating a lie often enough makes it familiar to us, the repetition making it fall right in with all of the things our memory tells us are true about the world. Every advertiser or propagandist knows this. Humans are social animals, and there is a primal part of us that still says, "If other members of the tribe who I feel close to believe this, there must be something to it."
"We will never regret any of these decisions."
And no, simply showing us the correct information doesn't fix it. Quite the opposite: research shows that once we've seized on an incorrect piece of information, exposure to the facts either doesn't change what we think, or makes us even more likely to cling to the false information. You can guess why this is: our self-image triumphs over all. It's more important that we continue to think of ourselves as infallible than admit we're wrong. This is how people go on believing in hoaxes even after their creators have admitted they were fake.
"Who would fake something like that?"
But wait, here's the best part:
Most of you will still think of this as something other people do, and that you of course are the unbiased observer who can clearly see their stupidity. There is a reason for this, too. They call it the Bias Blind Spot. The biases in your system cripple even your ability to examine your own biases. So just now, when you thought to yourself, "Ha, I've caught myself doing that! But at least I'm not as nutty as those 'Obama is a Muslim' nutjobs!", you just saw your own bias at work. You're trying to examine a broken mechanism with a broken mechanism. It's like trying to perform surgery on your own ass, with a scalpel that is itself clenched in your ass.
"So we're out of gloves..."