5 Science Headlines You Can Immediately Ignore
Communicating complex science to the layperson is hard. Even if you can come up with a metaphor that perfectly explains a new theory in astrophysics using contestants from The Bachelor In Paradise, it's hard to make people give a shit. And that's why a lot of science journalists don't bother to try. Rather than toil away at accurately explaining the intricacies of a new discovery, they just tell us that scientists have discovered whatever they think would be most exciting to us -- like the boy who cried, "Masturbating six times a day might make you an irresistible genius."
It Isn't You, It's Your Brain
If you're always late to work, don't worry: you can blame it on your brain. Similarly, lying, hating the sound of people chewing, and even ignoring PC security warnings aren't your fault; they're your dumb brain's fault. I'm sure it's a real relief for the tardy, lying, quiet-chewing people with computer viruses that all they need to fix these problems is a new brain. Better yet, I've got the inside scoop from a source who has a brain: not wanting to exercise, liking or not liking 30 Rock, and deciding to write comedy on the internet for a living may also be because of your brain. In fact, the existence of "it isn't you, it's your brain" articles may be the only example of people doing things without in any way engaging their brains.
Every single thing you do is connected to an electrochemical change in your brain. The fact that we've identified a particular correspondence between one type of change and people deciding to wear puka shell necklaces doesn't mean we get to excuse their behavior and just say, "Oh, that's how they're wired."
A grab bag of genetic defects.
Just because scientists have identified something happening between my ears doesn't make it any less a part of my personality. They're just different types of description. If someone is accused of prostitution, they can't defend themselves by saying, "No, it wasn't prostitution, see what happened was, they gave me money and in exchange I made X-Men 3." They've just described the exact same situation using different terms. You aren't any less culpable because it can also be described in terms of neurons firing.
The same goes for articles with this recurring theme: "Learning this skill physically changes your brain." They should be studied for their almost supernatural ability to create an absolute vacuum of meaning. It turns out, when you learn anything your brain physically changes. That's because your brain is where your learning lives.
If I've done my job correctly, your brain will have changed a lot by the time you're done reading this.
The real headline would be if one of these things had nothing to do with your brain. Like if someone discovered that liking Bones in no way involved your brain. Wait, that one might be true.
"We Found The ________ Gene"
Few concepts have captured the public consciousness like genetic manipulation. If genes are the blueprints for life and we are figuring out how to decode that blueprint, we should be a mere training montage away from recreating dinosaurs, having designer babies, and even creating designer dinosaurs. Finally, you'll be able to have that Teacup Tea-Rex that can ride around in your purse. Right?
"Don't tell me Jurassic Park is just fantasy! I know it can be real! I know it in my fossils!"
When we first started mapping the human genome it seemed like there was no limit to the things we'd be able to control with our new genetic knowledge. If the headlines were to be believed, we were discovering genes that determined our IQ, our left-handedness, and our appetite for movies where we're supposed to believe Pierce Brosnan could beat up a guy. In reality, science cannot adequately explain any of those things.
The problem with thinking there's a gene for liking broccoli, another for having six fingers, and another for being good at breakdancing, is that you're going to run out of genes real fast. We only have about 20,000 genes. That may sound like a lot, but keep in mind that we share 96 percent of those genes with chimps and 50 percent with bananas. (That means if you crossed 50 percent human genes with 50 percent banana genes, you could end up with a 100 percent human rather than the Bananaman superhero you were hoping for.) Plus, the world is a rich and diverse place. If there's a gene for "likes beer from the northwest of Germany" and another for "is nice to his grandma," you're going to have way too many things to code for.
We tried to make a Bananaman. He lived for one beautiful moment, then died screaming.
And of course that makes sense: Your genetic code is more complex than that and works in a complex network to produce effects. Genes can also be turned off or on depending on the context they're in, and a lot of their effects are conditional on the environment. Some people get blonder when they're getting more sunlight and that isn't because they got bit by a radioactive blonde. Still, people talk about what a gene "does" as though each one is going to have some simple surface-level function. That's like pointing at a blueprint for a car and asking which part makes it safe.
"We've Found The ________ Center Of The Brain"
Another version of the headline "we found the gene for being able to reconcile liking Woody Allen movies with the horrible allegations against him" is often "we found the 'thinking Logan is pretty good' center of the brain."
Typically headlines like these are based on brain imaging studies that find an area of the brain that is active when doing an activity (say, biking or running) and that isn't active when doing a closely related activity (say, riding a stationary bike or stabbing yourself in the lungs with hat pins). People then conclude, "We've found the 'riding a bike' area of the brain!" But the problem is that any reasonably complex task (and keep in mind that things like walking over rocky terrain are so unbelievably complex we still don't have robots that can do it reliably) is going to require a bunch of different parts of the brain.
On top of that, it's really easy to characterize a brain process too narrowly or too broadly. Imagine if all of California's wine bottles were made in Fresno but the actual wine was produced all over the state. If you knew as little about winemaking as we know about brains -- which would mean you would know that it's purple and therefore probably made from plums -- you could easily draw incorrect conclusions based on a map of how activity increases across the state when wine production goes up. When California starts really pumping out wine, you'd see activity go up all over the state, but you'd particularly see activity in Fresno go nuts, whereas it wouldn't budge an inch for an increase in making normal grape juice. It would be very tempting, therefore, to think Fresno is the wine capital of California. And that, coupled with the fact that it's arguably the meth capital of California, would make for some pretty interesting theories about California wine.
"Fresno: The Land Of More Than One Way To Destroy Your Life."
There's no reason to think the brain is organized in a way that would be intuitive to us humans examining it. The pretty colors we see in brain imaging don't necessarily mean the "activated" part of the brain is what's doing the relevant work. The brain isn't organized around our most common uses of it, just like a computer isn't organized by the applications you use it for. Saying "we found the love center of the brain" is like saying "this is the word processing part of the computer." It's not just wrong, it doesn't make any sense.
"Study Finds ________"
Cracked has mentioned science's replication crisis already. Because of the economic pressures to publish new and exciting results, some researchers have stooped to shady practices like testing the same hypothesis over and over, then only reporting their successes while hiding their failures.
But even if you've got squeaky-clean methodology -- even if you've done all the science right, and you haven't been influenced by outside money or ambition, and even if you haven't just repeated the experiment until you got the result you were looking for -- one study still doesn't mean shit. That's because sometimes coincidences happen.
In most fields, the current standard for saying a result is "significant" is to do a statistical test that says "if your hypothesis weren't true, you'd only expect to see data this extreme five percent of the time." So, if you're watching Bones and data keeps coming in saying that it's a terrible show: they have a computer that takes bones as input and just solves crimes for them; at one point a criminal embeds malware to destroy Bones' bone computer inside -- you guessed it -- bones; in another episode they decide to address America's history of slavery despite having almost no black people on the show ... the evidence starts to pile up.
Oh, man, remember this classic Bones scene? No, you don't.
Now, it's possible that future episodes will turn the trend around, maybe even Keyser Soze this thing into not being one of the worst shows to ever make it past the "stoned teenager's musings" stage of development. But the odds don't look good. If the statistics show that less than five percent of good shows have a streak of insanely bad episodes this long, you have "statistically significant" evidence that this is not a good show. You can now move on to more surprising statistical results like "water is wet" and "guys named Randy love corndogs."
But here's the thing about things only happening five percent of the time: they actually do happen. In fact, they happen about five percent of the time. If you do enough studies, you're going to get some false effects that meet this standard of evidence. If you have tens of thousands of studies that meet a five percent threshold for being "significant," you can expect that literally hundreds of them will be incorrectly labeled "significant" just by chance.
100 episodes, and each about a different bone. Amazing.
There are legions of scientists all over the world, each testing dozens of hypotheses. In at least some of those cases, the data are bound to look funny purely by chance even when there's no effect there. In fact, if that never occurred, that would be the most incredible coincidence of all.
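You can watch this happen with a toy simulation (entirely made-up numbers, not based on any real study): treat each "study" as 100 flips of a fair coin, so by construction there is no effect at all, and count how many of 10,000 such studies still clear the p < 0.05 bar by dumb luck.

```python
import math
import random

random.seed(0)

N_FLIPS = 100
N_STUDIES = 10_000

# Exact probability of each possible heads count for a fair coin.
pmf = [math.comb(N_FLIPS, i) / 2 ** N_FLIPS for i in range(N_FLIPS + 1)]

def p_value(heads):
    """Two-tailed p-value: chance of a result at least this far from 50/50."""
    k = abs(heads - N_FLIPS / 2)
    return sum(p for i, p in enumerate(pmf) if abs(i - N_FLIPS / 2) >= k)

# Run 10,000 "studies" of an effect that does not exist,
# and count how many come out "statistically significant" anyway.
false_positives = sum(
    p_value(sum(random.random() < 0.5 for _ in range(N_FLIPS))) < 0.05
    for _ in range(N_STUDIES)
)
print(f"{false_positives} of {N_STUDIES} studies of nothing were 'significant'")
```

Run that and a few hundred coins come back certified as loaded, which is a few hundred headlines about an effect that was never there.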
Basically Anything About How Memory Physically Works
One of the jobs your brain does aside from replaying your most embarrassing moments when you're trying to fall asleep is processing information. It takes in information from the senses, it manipulates that information in various ways, and then it stores some of that information for use later. It may surprise you to know, then, that we have basically zero idea how this works. I say, "basically zero idea" because the situation is actually a bit worse than that: we have a bad idea that keeps leading us astray.
Some of the earliest and most influential work on memory and learning was Pavlov's dog experiments. Fans of Cracked will remember his groundbreaking work proving that if you cut a bunch of holes in dogs they will die. But to the layperson he's better known for showing that you can gradually connect two unrelated things in the brain -- say, hearing a bell ringing and salivating as though it's meal time, or hearing Nerf Herder and feeling like your teenage problems are about to be allegorically explained by the slaying of demons.
The better part of a century later, researchers discovered a process by which the brain rewires itself called long-term potentiation. Basically, if you have two connected neurons, one triggering the other over and over, the connection between them actually gets stronger. That means over time it becomes even easier for one neuron firing to kick off the other.
"Quit settin' me off, bro!"
Two basically unrelated entities that, when activated in quick succession, slowly forge a causal relationship? Not only is that a great idea for a buddy movie starring two neurons, it's clearly the explanation for how learning works on a cellular level. That's why it's been neuroscience's basic understanding of how memory and learning work for half a century now. And since so much of understanding the brain also entails understanding memory and learning, that must mean this is bedrock scientific knowledge, right?
In fact, it's Bedrock scientific knowledge in that it is about as accurate as cavemen using brontosauruses as cranes.
For one thing, a lot of "classical conditioning" studies that seemed to show that animals gradually get better at a task actually show the opposite. Reanalyzing the data from landmark studies, researchers found that the animals learned their tasks abruptly, as though they had a "eureka" moment -- though, obviously not enough of a "eureka" moment to break out of their cages and rise up against us ... yet. The fact that animals seemed like they were gradually getting better for decades was just because scientists were averaging the data from multiple subjects. If eight mice all have "eureka" moments at different times and you average their data, it looks like the mice are all gradually getting better at the task -- like their behavior is being slowly strengthened by reinforcement or weakened by punishment.
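You can see the averaging artifact with another toy simulation (invented numbers, not real mouse data): give eight fake mice a sudden "eureka" jump from 10 percent success to 90 percent success at some random trial. Each individual curve is a cliff, but the group average comes out looking like a gentle, gradual slope.

```python
import random

random.seed(1)

N_TRIALS = 40
N_MICE = 8

# Each fake mouse "gets it" all at once: success probability jumps from
# 10% (guessing) to 90% (learned) at one random eureka trial.
eureka = [random.randint(5, 35) for _ in range(N_MICE)]
curves = [[0.1 if t < e else 0.9 for t in range(N_TRIALS)] for e in eureka]

# Averaging those step functions produces a smooth "gradual learning" ramp.
avg = [sum(c[t] for c in curves) / N_MICE for t in range(N_TRIALS)]

for t in range(0, N_TRIALS, 5):
    print(f"trial {t:2d}: {'#' * int(avg[t] * 20)} {avg[t]:.2f}")
```

Plot that average and it looks exactly like slow, incremental reinforcement, even though not a single simulated mouse learned anything gradually.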
For another thing, animals can learn things in one trial. You don't have to touch every part of your body to a hot stove in order to gradually learn that you don't want to touch any part of your body to it. You learn that the first time you touch a hot stove with your tongue to see if it "tastes red."
In general, people working at the cutting edge of memory research have found a lot of data that simply can't be explained by long-term potentiation being the physical basis of memory. And that sucks because it's basically the only story we had. Recently, researchers have found exciting new possibilities involving Purkinje cells, microRNA, and other things more foreign to a psych undergrad than the respect of literally any other major.
Even the Underwater Basket Weaving majors know it.
But the bottom line is that, when it comes to how information is physically written in your brain, we have basically no idea at this point. Our current story about neurons and memory isn't just wrong, it's so wrong it's like trying to stick your house key in your computer to log into gmail. Or something you'd see on Bones.
For a more detailed look at how completely inadequate our current theories about memory are, check out Memory And The Computational Brain by C. R. Gallistel and Adam King. For jokes, check out Aaron's Twitter.