Humans are terrible with numbers. They just don't fit in our brains. It's why scientists can hammer us with statistics about global warming but we will stop believing in it as soon as it gets cold where we live. It's not our fault -- the human brain just isn't built for this, and quite frankly, we don't really need to process huge numbers to get by in our everyday lives. You don't have to understand long-term data trends in order to change a goddamn light bulb.
But there are some basics that everyone should know. Each of them sounds incredibly simple when it's explained, yet each of them will fool you again within days of reading this article. So try to keep in mind ...
#5. What We Call "Average" Actually Isn't
Sounds Like ...
Here's a shocking statistic: The average income in the United States is around $70,000. If your income is below that level, reading that is quite a kick. You thought you were actually doing pretty well for yourself, but now you're tempted to get a second job just to bump your income up to what the asshole next door is probably making. What's their secret, damn it? Are they all cooking meth?
"What the hell else are garages for?"
The Problem Is ...
The popular use of the term "average" is way different from the mathematical term, but they get used interchangeably. That's why we're so often shocked at how the "average" person is richer/fatter/taller than us. In everyday language, we use the word "average" to mean "most people," or the most representative person (as in, "The average person doesn't read classic literature" or "The average Joe can't afford to dress like Prince"). But then when they start using the word "average" to talk about statistics, you get weird results, like the fact that 67 percent of people in the USA make less than the "average" income. So how the fuck can "average" mean "most people" when most people aren't average?
Well, we all learned in school how to calculate an average: You take all the values you're averaging, add them up, and divide the total by the number of values. This is fine if what you're trying to average is pretty uniform -- the average of 1, 2, 3, 4, and 5 is 3, right there in the middle. The problem is that averages become absolutely useless if a minority of the numbers are unusually high -- the average of 1, 2, 3, 4, and 40 is 10, which doesn't help anybody know shit about anything.
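If you want to watch that happen, here's a quick back-of-the-envelope sketch in Python, using nothing but the made-up numbers from above:

```python
# How one unusually big value drags the mean around.
def mean(nums):
    # Add everything up, divide the total by how many values there are.
    return sum(nums) / len(nums)

print(mean([1, 2, 3, 4, 5]))   # 3.0  -- sits right in the middle
print(mean([1, 2, 3, 4, 40]))  # 10.0 -- bigger than four of the five values
```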
"Unfair? That averages out to be 12 each!"
And that's the problem with the "average income" statistic -- a few rich people are skewing the shit out of the number. If you're earning less than the average income, it's not because your job is screwing you, it's because you live in the same country as Bill Gates, Mark Zuckerberg, and whoever owns Coke. Mr. Coke or whoever.
Why Does It Matter?
This is so stupidly obvious when explained, but it creates more myths by the day. For instance, you can see one study showing that for every 100 Americans, there are 88 guns, which could lead someone to reasonably assume that it's hard to find an American who isn't packing heat. Then you see another study from the same year showing that only 43 percent of households have guns in them. It's the same deal -- the people who have tons of guns skew the average upward ... and in the process make it hard as hell to get an idea of the overall picture. More often than not, telling us the average just muddles the issue.
"Every time you enter the room, I feel enormous. Thank you, mathematical averages!"
That's why most reputable sources that try to figure out income distribution use the median income, not the average. The median is the actual middle point: You get there by crossing out values on either end until you reach the center, which gives you a more relatable figure of around $50,000. There, we've just made 17 percent of you feel a little better about yourselves.
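Here's the difference in action, sketched with Python's built-in statistics module. The incomes are completely hypothetical -- nine ordinary paychecks plus one cartoonishly rich neighbor standing in for Mr. Coke:

```python
import statistics

# Ten made-up incomes: nine regular earners and one guy who owns Coke (or whoever).
incomes = [28_000, 35_000, 41_000, 47_000, 52_000,
           58_000, 63_000, 70_000, 85_000, 4_000_000]

print(statistics.mean(incomes))    # 447900  -- the "average," thanks to one rich guy
print(statistics.median(incomes))  # 55000.0 -- the actual middle of the pack
```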
#4. A Claim of "99 Percent Accurate" Can Be Both True and Meaningless
Sounds Like ...
You're sitting on the bed in the doctor's office, and he's got bad news. You've tested positive for some kind of cancer. You say, "Are you sure, doc?" and he goes on to inform you that the particular test they used is 99 percent accurate at detecting cancer when it's there, and it produces a false positive only 1 percent of the time in healthy people.
Holy shit, you're 99 percent doomed! Strap on your parachute and buy a monkey, it's time to start on that bucket list!
"Fuck the chute, I'm aiming for that mountain!"
The Problem Is ...
Actually, even though everything the doctor said was true, there's still only a 1 in 80 chance that you have cancer.
Wait, what? How is that possible? Because what you've actually come down with is an acute case of the base-rate fallacy.
Yes, it's true that if you have cancer, the test is 99 percent accurate in telling you -- meaning out of 100 people with the disease, it only misses once. The problem is that other number: the fact that when the cancer isn't there, the test still comes back positive 1 percent of the time. So in the course of trying to find the cancer, it's telling so many other people that they have it that a positive result almost becomes meaningless. How can that be true when it only gives a false positive 1 percent of the time? Because that's still a huge number of people.
"Boooo! Boo to science and boo to numbers!"
For instance, only about 1 in 8,000 people actually has, say, pancreatic cancer. But if this test throws out a "you have cancer" result for 1 percent of healthy people, the doctor would tell about 80 of those 8,000 people they're sick -- on top of the one who statistically actually is. So for any one person, a "you have cancer" result only has about a 1 in 80 chance of being true. It's still a good reason to take more tests, but probably not enough to tell your boss off and start cooking meth.
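Here's that same back-of-the-envelope math as a rough Python sketch, using only the numbers above (1-in-8,000 prevalence, 99 percent detection, 1 percent false positives):

```python
# Imagine 8,000 people take the test.
population  = 8_000
prevalence  = 1 / 8_000   # about 1 of them actually has the cancer
sensitivity = 0.99        # the test catches 99% of real cases
false_rate  = 0.01        # ...and wrongly flags 1% of healthy people

sick    = population * prevalence   # ~1 person
healthy = population - sick         # ~7,999 people

true_positives  = sick * sensitivity    # ~1 correct "you have cancer" result
false_positives = healthy * false_rate  # ~80 scary but wrong results

print(true_positives / (true_positives + false_positives))  # ~0.012, i.e. roughly 1 in 80
```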
Why Does It Matter?
A lot of technology makes promises like this that sound impressive, but only if you ignore the base rates. For example, a few years ago the U.S. Transportation Security Administration started talking up its new terrorist screening technology. They claimed it could catch over 99 percent of terrorists who pass through, while identifying only 0.01 percent of innocent people as al-Qaida operatives.
"The cavity search was not as fun as I had originally pictured it in my head."
Those numbers sound fantastic, but as with our cancer example, this means that a gigantic number of innocent people are getting treated as terrorists. For instance, passengers boarded planes in the USA about 700 million times in 2010. If their terrorist detector "only" throws out false positives 0.01 percent of the time, that means 70,000 fucking people are getting pulled out of line, accused, and searched. And in most years, statistically 0 percent of those people are terrorists. So that 0.01 percent sounds great, until you're the one they accuse of having a tiny bomb wedged in your rectum.
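The arithmetic behind that, as a tiny Python sketch (the 700 million boardings and the 0.01 percent rate come straight from the paragraph above):

```python
boardings      = 700_000_000  # roughly how many times someone boarded a US flight in 2010
false_pos_rate = 0.0001       # the claimed 0.01% false-positive rate

print(boardings * false_pos_rate)  # 70000.0 -- that's 70,000 innocent travelers flagged
```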