The Problem Is ...
The popular use of the term "average" is very different from the mathematical one, but the two get used interchangeably. That's why we're so often shocked at how the "average" person is richer/fatter/taller than us. In everyday language, we use the word "average" to mean "most people," or the most representative person (as in, "The average person doesn't read classic literature" or "The average Joe can't afford to dress like Prince"). But when that same word gets used to talk about statistics, you get weird results, like the fact that 67 percent of people in the USA make less than the "average" income. So how the f**k can "average" mean "most people" when most people aren't average?
Well, we all learned in school how to calculate an average: You take all the values you're averaging, add them up, and divide the sum by the number of values. This is fine if what you're trying to average is pretty uniform -- the average of 1, 2, 3, 4, and 5 is 3, right there in the middle. The problem is that averages are absolutely useless if even a few of the numbers are unusually high -- the average of 1, 2, 3, 4, and 40 is 10, which doesn't help anybody know s**t about anything. (The number that actually tells you something there is the median -- the middle value when you line them all up -- which is still 3.)
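If you want to see that with actual code, here's a quick sketch using Python's standard `statistics` module on the same numbers from above (the variable names are just for this example):

```python
# Mean vs. median on the numbers from the article.
# Uses only Python's standard "statistics" module.
import statistics

uniform = [1, 2, 3, 4, 5]
skewed = [1, 2, 3, 4, 40]

print(statistics.mean(uniform))   # 3  -- the mean sits right in the middle
print(statistics.mean(skewed))    # 10 -- one big value drags the mean way up
print(statistics.median(skewed))  # 3  -- the median still looks like a "typical" value
```

Swap one value for something huge and the mean jumps, while the median barely moves -- which is exactly why "average income" sounds so much richer than most actual people.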
"Unfair? That averages out to be 12 each!"
And that's the problem with the "average income" statistic -- a few rich people are skewing the s**t out of the number. If you're earning less than the average income, it's not because your job is screwing you; it's because you live in the same country as Bill Gates, Mark Zuckerberg, and whoever owns Coke. Mr. Coke or whoever.