Symptoms

Spotting stupid things in a newspaper is a bit like shooting fish in a supermarket: It's incredibly, incredibly easy. But if you've never read something critically before, because of some grossly misplaced trust you have in people capable of writing, here is a short list of symptoms that can easily be found in flawed health reporting.

#8. ____ Is Incredibly Good/Bad for You

Here's a headline you've probably read a few times already today:
Now, I'm not going to pick specific fights with specific foods or drugs; I don't want to provoke too many fistfights in the comments section, and in this particular example, I adore red wine, a key element of my morning routine. But you see this kind of thing all the time: Some study finds a link between anything and anything, and a newspaper will hold that study up as proof that the universe demands we eat more kale and less fudge, even if that sounds patently ridiculous.
Seriously, fuck you, universe.
What these articles usually don't do is qualify what exactly that "link" means. If you're missing a discussion of:

- the cost of a diet/treatment,
- its side effects,
- how it compares with other treatments and
- how it compares with doing nothing at all,

... then you're not reading very useful advice. Consider that last item for a second. Does a new diet reduce the risk of getting a disease from 5.4 percent to 5.2 percent? (I don't know or care what the units could be. Riskometers, let's say.) That's a drop of just 0.2 percentage points, the kind of number a headline loves to rebrand as "a 4 percent reduction in risk." It's a potentially interesting finding, and it could certainly serve as a guide for future research, but it's not the kind of evidence that should inspire you to cram fish oil into your throat until you stop blinking.

_______________________________________________________________________

#7. Conflicts of Interest

Scientists have proven that, on occasion, human beings have been known to act unethically. (Whether we can trust those scientists remains unknown.) That means it's entirely possible for a scientist, or really anyone wearing a lab coat, to have some sort of financial stake in the advice they're giving. And a lot of articles just don't look very critically at the financial backing behind these new studies. This is important to know. I'm not going to say that all egg-related advice from the Egg Council of America is invalid, but I would be a little wary if they started advising a three-egg-a-day shampooing regimen. Maybe a reporter would pick up on something so blatant, but what if the Egg Council of America changed its name to Doctors for Healthy Health? Is a reporter going to catch that? Why would he? Just look at the name! Doctors for Healthy Health. Who wouldn't trust those guys? Those shiny, shiny-haired guys?

_______________________________________________________________________

#6. Thin Evidence

Another warning sign is an article that cites a single source, whether it's a single study or a single expert or even a single married expert.
Single studies can throw up all sorts of misleading or contradictory evidence; it's very common for weak, circumstantial or cherry-picked results to be passed along as "the truth" in such cases. If you're not seeing any discussion of a study's methodology, you'll have no idea whether it had a tiny sample size, whether it had a control group at all, or even whether it was done on animals rather than humans.
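For the statistically curious: here's a toy simulation (mine, not the article's; all names and numbers are invented for illustration) of why tiny studies are so good at producing dramatic nonsense. We simulate a "treatment" that genuinely does nothing, and watch how much more wildly the results swing when each study only has ten subjects:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def observed_rate(true_rate, n):
    """Simulate a study of n subjects, each of whom independently
    'responds' with probability true_rate; return the observed rate."""
    return sum(random.random() < true_rate for _ in range(n)) / n

TRUE_RATE = 0.50  # the treatment does exactly nothing special

# Ten tiny studies vs. ten large studies of the same non-effect.
small = [observed_rate(TRUE_RATE, 10) for _ in range(10)]
large = [observed_rate(TRUE_RATE, 10_000) for _ in range(10)]

print("n=10     :", small)   # swings wildly; some 'studies' look exciting
print("n=10,000 :", large)   # all cluster tightly around 0.50
```

With ten subjects, pure chance routinely produces "70 percent of patients improved!" headlines out of a coin flip; with ten thousand, every run lands near the boring truth. That's the gap between "a study" and "a good study."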
"It seems monkeys hate being hit by paintballs."
"Fascinating. But we're still years away from the FDA permitting human trials."
"We'd better do the monkeys again, then. Tighten up those error bars a bit."
Just to show you an example of advice you can trust, let's look at another field where there isn't just a single study pointing at something, where the literature is comprehensive and the science conclusive: ear science. "Don't put that in your ear!" is almost always a correct thing to say. At any moment during human history, saying "Don't put that in your ear!" is accurate advice and can't get you in too much trouble.
Maybe not all the time.
Causes

Although Cracked isn't a "proper" newspaper, due to the general decline in quality, respect and readership of those "proper" newspapers, Cracked has somehow become the foremost respected media source available today, right up there with Yahoo! Answers. This means that my experience, however limited and possibly fictional, permits me to speculate about why mainstream health reporting is so terrible.

#5. Lack of Knowledge

Earlier I talked about choosing proper methodologies, cost/benefit analysis and literature review. These are just some of the terribly boring chores that are a sad but necessary part of being an expert in a field. That's why we can't expect reporters to be experts in every area they cover; it's often the case these days that a reporter who a month ago was writing the "Tides Update" on page G18 gets promoted to covering something actually useful, even if he doesn't know the first thing about it. And the easiest way for that reporter to write about something he doesn't understand is to find and cite an expert. It's a safe play for our young reporter; by simply describing the expert's credentials and then relaying what they say, the reporter's young, soft ass is covered. It's the expert telling the story, not the reporter. So it's up to the audience to figure out whether the expert is:

- actually an expert,
- a hired stooge, or
- two children atop each other's shoulders in a long coat.

And even if our young reporter finds an actual, respectable expert who isn't trying to do anything sneaky, he can still mess up if he happens to know not a single goddamned thing about how science works. For example: There are big differences between "a study" and "a good study" and "a published study" and "a study that's been independently confirmed" and "a study that's been independently confirmed a dozen times over." These differences are important; when a scientist says something, it's not the same as the Pope saying it.
It's only when dozens or hundreds of scientists start saying the same thing that we should start telling people to guzzle red wine out of a fire hose.
"Please never do that."
But, as discussed, these subtleties aren't really obvious, and when they are, they're really, really boring. Which is why, in reality, the first time a single lab rat parties so hard it forgets it has cancer, the newspapers dash off another batch of red wine stories.