4 Medical Myths We Got Thanks to Bad Reading Comprehension
You don’t need to spend your nights carrying out experiments or poring over data. That’s the job of scientists. Then scientists summarize their findings in papers. Then the media summarize those papers in articles. Then other outlets summarize those articles with their own articles. You might not even read those articles, but luckily, those are summarized as headlines.
It’s a fine system, when it works. But if any part of this chain screws up, we all end up believing lies.
MSG Causing ‘Chinese Restaurant Syndrome’ Was a Joke
For decades, monosodium glutamate (MSG) has had a reputation for messing with people’s health. Some say it gives you headaches, while other people imagine various more serious long-term effects. In reality, MSG has the same effects on your body as salt does — except, it takes more MSG than salt to get that same effect.
So, it’s safe, and it’s probably safer than a whole lot of other stuff you eat. To get the stuff to cause any noticeable symptoms, you’d have to gobble it up without any food, which is something no one normally does. And even then, most people would feel nothing, and any symptoms you do feel aren’t dangerous and will go away on their own.
The idea that MSG may be bad originated with one letter written to the New England Journal of Medicine in 1968 by a doctor named Robert Ho Man Kwok. The letter talked about how every time the guy and his friends went to a Chinese restaurant in America, they’d feel numb for a couple hours. Maybe it was the MSG, they speculated. Or maybe it was the salt. Or maybe it was the wine, he suggested, since he was describing something quite similar to drunkenness. Anyway, he hoped someone would study the matter, since he himself wasn’t equipped to.
The Journal assigned this letter a title: “Chinese-Restaurant Syndrome.” That is not a serious title, and it reflected the level of seriousness they expected readers to devote to this anecdotal observation. Readers responded appropriately, by not taking it seriously at all. Instead, they wrote their own letters, describing ridiculous symptoms from eating Chinese food, all delivered with deadpan verbose medical terminology. One reader complained of “lacrimation, peri-orbital fasciculation,” which means crying and eye-twitching. Another complained of “facial and cervical flushing.”
One reader wrote demanding the original letter writer’s true identity, reasoning that Robert Ho Man Kwok had to be his troll name. It sounded like “Human Crock,” didn’t it? In reality, that was the guy’s actual name, but it did look fake, and years later, someone would falsely claim to be the true author of the letter and to really have chosen the name to make a “human crock of you-know-what” joke.
Other letters got more creative with their responses. One student sent in a limerick:
My thanks to this great periodical
For its studies on food so methodical
Now my clams are full steamed,
And my Chinese food screened.
And my appetite, oh well, much less prodigal
Another student one-upped that with a more elaborate poem:
Mourn, Sweet and Sour, your lost charisma
Midst painful jaw and flushed platysma
Of etiology once inscrutable
Your syndrome now is irrefutable
(Not mushrooms, nor tetrodoxin —
No more than bagels with their lox in.)
Great havoc does your whim create
With excess sodium glutamate
Your gustation’s ginger-peachy
Though less digestible than the lichee
What allergen — some vile miasma?
I’d sooner you than bronchial asthma
The comment section was having a ton of fun. Then the mainstream media caught word of this. The New York Times published an article called “‘Chinese Restaurant Syndrome' Puzzles Doctors,” reporting on the phenomenon with full earnestness. Soon, newly informed members of the public reported feeling headaches and tingling after eating Chinese food. They didn’t in clinical studies, but they did in the wild, and they blamed MSG. It’s possible that many of these people were simply drunk.
Some of you might feel baffled or even outraged that a scientific publication like the New England Journal of Medicine should act as a forum for comedy. But its status as such a serious medical authority is precisely why its Correspondence section became so full of jokes.
Even when intended as legit, a letter about an anecdote or casual observation isn’t real medicine, according to the Journal. Real medicine needs real studies, studies that the Journal publishes. If it’s not a proper study, if it's just the letters section, it’s just people shootin’ the shit. And that’s fine, so long as some outsider with no understanding of this convention doesn’t stroll in and take it seriously.
Medical Errors Aren’t a Leading Cause of Death
“Medical Errors Are No. 3 Cause of U.S. Deaths” read headlines in places like NPR and the Washington Post. Number one is heart disease, cancer ranks a close second and medical mistakes are number three. These stories were responding to recent Johns Hopkins data published in the British Medical Journal. Actually, they were responding to a letter med students wrote to the CDC about that data, but either way, if hospital mistakes were really responsible for 10 percent of all deaths in America, that would be a big deal.
The British Medical Journal soon published a follow-up piece that said, “Hey, that thing we put out from JHU on medical errors? It wasn’t a study, and you shouldn’t treat it as one.” It had really been just a call for better reporting of medical errors because death certificates currently don’t mention if any error was involved. To come up with stats, the JHU analysis averaged results from a few previous studies, and it averaged them badly. One was a study of 12 deaths, while a second was of 14. You shouldn’t use tiny numbers like that to extrapolate a total of 400,000 deaths nationwide.
The third study they factored in looked bigger — they claimed it covered 180,000 Medicare deaths — but it really just tracked 780 patients, and 12 problem deaths, then estimated how many deaths that meant for the entire Medicare program. This study was exclusively on Medicare patients that had been hospitalized for 24 hours or more. It concluded that 1.5 percent experienced some “adverse event” that contributed to their deaths. The JHU analysis multiplied that percentage by the total number of hospital admissions in the U.S. to calculate how many die from hospital errors. That’s nuts. Not everyone who walks into a hospital has the same chance of dying as a senior citizen who’s hospitalized overnight or longer.
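To see why the small samples matter, here’s a quick back-of-the-envelope sketch (illustrative Python with an assumed admissions figure, not the studies’ actual method): with only 12 deaths in a sample of 780, the uncertainty on the underlying rate is enormous, and multiplying it across millions of admissions blows that uncertainty up into the hundreds of thousands.

```python
import math

# Illustrative sketch, not the studies' actual method: estimate how
# uncertain a 12-out-of-780 death rate is, then scale it nationally.
deaths, patients = 12, 780
rate = deaths / patients

# Normal-approximation 95% confidence interval for a binomial proportion
se = math.sqrt(rate * (1 - rate) / patients)
lo, hi = rate - 1.96 * se, rate + 1.96 * se

admissions = 35_000_000  # rough U.S. annual hospital admissions (assumption)
print(f"rate: {rate:.2%}, 95% CI: {lo:.2%} to {hi:.2%}")
# the extrapolated range spans hundreds of thousands of deaths
print(f"extrapolated: {lo * admissions:,.0f} to {hi * admissions:,.0f}")
```

Under these assumed numbers, the confidence interval alone spans roughly half a million deaths per year, which is the whole point: a sample that small can’t pin down a national figure, even before you ask whether the sample resembles the nation.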
Plus, the Medicare study classified only a minority of those adverse events as “preventable.” They were issues the patients encountered in the hospital, but if they weren’t preventable, do they really count as mistakes? And the JHU analysis also counted as errors such minor hospital-acquired conditions as bedsores. Those can get bad, but they’re generally not lethal, so it looks like the full roundup included all deaths following errors rather than just deaths caused by errors. If someone dies of cancer but a chart misspelled their name, maybe we shouldn’t count that as a death due to medical error.
The JHU analysis said 62 percent of all hospital deaths may be from medical errors. Other meta-analyses come up with a number more like 4 percent. Which is still a lot, but that’s a big difference. It’s the difference between “let’s go to the hospital and get that gaping wound looked at” and “actually, with all the mistakes they make there, we’re better off just staying home and hoping for the best.”
Men Aren’t More Likely to Abandon a Sick Spouse
Death isn’t the only possible consequence when you get sick, however. There may be another D for you to worry about — divorce. A study from 2015 determined that couples are more likely to split after the wife gets seriously ill. If the husband got seriously ill, no increased chance of divorce appeared, so lots of news sites reported on this seemingly proven one-sided trend of men abandoning their sick spouses.
The sociological study looked at 2,701 marriages, whose progress was tracked by the University of Michigan’s Health and Retirement Study across 20 years. But when some other researchers tried doing their own number crunching on those same data, they didn’t find the same results at all. The original researchers now gave their work another look, and they saw what the problem was.
Every time a couple stopped responding to the survey? They should have counted that as the couple dropping out. Instead, they counted that as the couple divorcing, and that screwed up their results. A single line of faulty code produced the whole mistake.
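The shape of that bug is easy to sketch. Here’s a minimal illustration in Python (with made-up numbers, not the study’s actual code or data) of how recoding every missing response as a divorce inflates the apparent divorce rate:

```python
# Illustrative sketch, not the study's actual code: each record is a
# couple's final survey status, where None means they stopped responding.
records = ["married"] * 70 + ["divorced"] * 10 + [None] * 20

def divorce_rate_buggy(statuses):
    """Bug: anything that isn't 'married' gets counted as a divorce,
    so dropouts masquerade as divorces."""
    divorces = sum(1 for s in statuses if s != "married")
    return divorces / len(statuses)

def divorce_rate_fixed(statuses):
    """Fix: dropouts are censored, i.e. excluded from the calculation."""
    known = [s for s in statuses if s is not None]
    return known.count("divorced") / len(known)

print(divorce_rate_buggy(records))  # 0.3
print(divorce_rate_fixed(records))  # 0.125
```

Treating a couple that stops responding as censored rather than as an event is standard survival-analysis practice; the buggy version more than doubles the rate here, and if one group happens to drop out more often, it manufactures a difference between groups out of thin air.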
The researchers now did the right thing and issued a formal retraction. “We conclude that there are not gender differences in the relationship between gender, pooled illness onset, and divorce,” they now said, and you’ll today find only the corrected second paper they published, not the original. Some news sites even reported on the retraction. But those articles we linked to before are still up, with no corrections. And years after the retraction, other news sources were still citing the study, to prove there’s an epidemic of men leaving wives who are hooked up to too many tubes.
Left-Handers Don’t Die Younger Than Right-Handers
Left-handers live every day in a world that wasn’t designed for them. They must use right-handed can openers and right-handed nunchucks, and this sometimes goes beyond inconvenience into outright danger. Many articles that round up all the indignities lefties suffer mention a stat from a 1991 study, which says left-handed people on average die nine years younger than right-handers. Wow, that’s quite a statistic!
Nine years? Nine years? That would make handedness one of the biggest predictors of death ever, short of actual terminal illness. Even smokers don’t die nine years sooner than non-smokers on average. How much time does a typical person spend using tools of any kind, for the handedness of tools to change everyone’s life expectancy that much?
When they came out with the study, the psychologists who ran it pointed to an earlier study they’d done, which said left-handers were more likely to get into accidents and were almost twice as likely to get injured in a car accident. Car accidents could have an effect on life expectancy, since many people do die in cars. But left-handers are twice as at risk for car accidents — really? Then, when we dig up the study, we see that it was based on just 19 car accidents by left-handed people and tried extending that rate to the population at large. We were just talking about this: You can’t use data that small to extrapolate anything.
As for the nine-year study, the psychologists didn’t follow left-handers and right-handers over the course of many years and see when each person died. They looked at the death records of around a thousand Californians and calculated that the lefties died at a mean age of 66, while the right-handers died at a mean age of 75. Again, the sample size was too small, as they were dealing with just around 70 lefties and perhaps just four lefty car accidents. But there was also a larger problem.
They found that people who lived into old age were more likely to be right-handed, compared with people who died young. But it’s also true that people who lived into old age in 1989 were more likely to be right-handed than people who were alive and young in 1989. That’s because, a generation or two earlier, kids were commonly brought up as right-handed even if they were actually born left-handed. The reason few left-handers died in their 70s in 1989 wasn’t that most died younger thanks to accidents — it’s because most left-handers born in the 1910s were raised as righties and called themselves righties as adults.
To reevaluate this, scientists at Microsoft and Harvard last year ran their own analysis on two million Americans who died in 1989. They ran a simulation that assumed no actual difference in life expectancy between left- and right-handers, but simply took into account the varying levels at which left-handedness was reported by various generations. They found that this returned an apparent difference in life expectancy of nine years, just like the psychologists had. (Actually, their simulation showed an apparent difference of 9.3 years, which was slightly higher than the psychologists’ 8.97.)
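That kind of simulation is easy to approximate. Here’s a hedged sketch (assumed reporting rates and lifespan distribution, not the actual Microsoft/Harvard parameters): everyone’s lifespan is drawn from the same distribution, but the rate at which people report being left-handed rises with birth year, and that alone opens a multi-year gap in mean age at death.

```python
import random

random.seed(0)

# Sketch with assumed numbers, not the researchers' actual parameters.
# Everyone dies in 1989 and handedness has no effect on lifespan; only
# the REPORTED rate of left-handedness differs by birth year.
def reported_lefty_rate(birth_year):
    # Assumption: reporting rose roughly linearly from 3% (born 1900)
    # to 12% (born 1970) as forced right-handedness faded out.
    t = min(max((birth_year - 1900) / 70, 0.0), 1.0)
    return 0.03 + 0.09 * t

lefty_ages, righty_ages = [], []
for _ in range(200_000):
    age = min(max(random.gauss(70, 15), 0.0), 105.0)  # same for everyone
    birth_year = 1989 - age
    if random.random() < reported_lefty_rate(birth_year):
        lefty_ages.append(age)
    else:
        righty_ages.append(age)

gap = (sum(righty_ages) / len(righty_ages)
       - sum(lefty_ages) / len(lefty_ages))
# Several years of apparent gap, purely from the reporting artifact
print(f"apparent right-left gap: {gap:.1f} years")
```

The exact gap depends on the rates you assume, but the conclusion survives any reasonable choice: a generational reporting artifact, with zero real lifespan difference, produces a left-handers-die-young statistic all by itself.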
Incidentally, a lot of articles on the 1991 study refer to it as a New England Journal of Medicine study. It wasn’t really. But the Journal did publish a letter from the psychologists in their Correspondence section.
Follow Ryan Menezes on Twitter for more stuff no one should see.