4 Dumb Thought Experiments That Need to Be Eradicated From the Hivemind


Even though the thought experiment is, in theory, supposed to be a pure philosophical discussion — a conundrum cooked up to change your viewpoint on an issue in a compelling way — it’s lost a little bit of that shine, especially thanks to the internet. Nowadays, hearing the words “let’s try a thought experiment” usually just makes your brain steel itself for some deeply dumb contrarian bullshit. If you’re scrolling through Twitter and see someone use the words “thought experiment” or “devil’s advocate,” I can’t urge you enough, for the betterment of your own week, to keep scrolling at high speed.

Of course, a good thought experiment is a wonderful thing to kick around the old noggin. You’ve got your classic trolley problem, which only becomes more relevant now that cars are driving themselves (terribly). The Ship of Theseus is always a fun one to pull out three beers deep, until you and your friends are all yelling at each other about planks. My least favorite type of thought experiment, though, is the one that seems engineered to get a “whoa” out of a stoned Joe Rogan more than to inspire any actually interesting thought.

In particular, here are four thought experiments that the world doesn’t need and nobody asked for…

Buridan’s Ass


This donkey has no idea he’s part of an infuriatingly dumb theory.

The conflict, which is a generous term here, presented in Buridan’s Ass, named for French philosopher Jean Buridan, is as follows: Could God give himself such a big, bodacious ass that he could not create jeans to contain it? Just kidding, but that would be a lot more fun than the real one. The ass in question is a boring old donkey, who is starving, and standing equidistant between two bales of identical hay, each precisely as easy to reach and eat as the other.

The question is, which bale of hay does the donkey choose? Or — and this is a real situation people present — would the donkey simply be torn by indecision so deeply that it starves? My first, and deepest, thought inspired by this experiment is: Who gives a shit? I understand this is me not giving this “puzzle” the benefit of thought, but thought is something it has not earned. If you want to cook up a proverb, sure, I’ll take it. “Don’t be the donkey that starves choosing between two bales of hay” sounds good enough, charming and folksy. But if you expect this scenario to last, even in theory, longer than it takes for the donkey to eat both bales of hay, suddenly I need to go to the bathroom.

The Life You Can Save


Counterpoint: This could be a ghost.

I almost had to do a lap of my apartment just thinking about this shit. Presented in a probably equally dumb book of the same name by a guy named Peter Singer, this is some dumb half-baked gotcha shit that’s pretending to be a great human mystery. The whole thing is about being a better person, which is a pretty easy way to make people feel weird about saying that it’s very stupid. Here’s the gist of it: If you were walking in your expensive work clothes and saw a drowning child, would you jump in to save them, even if it would ruin your fancy clothes? The answer to this, unless you are a sharply dressed sociopath, is obviously yes.

But alas! By answering yes, you have fallen into Singer’s great trap! Because by the same logic, how could you freely spend money the way you do while a child starves somewhere else in the world! You are undone! How does that petard feel, dummy! This is the most grandiose false-equivalence bullshit I’ve ever heard, and it’s created in service of something everybody already understands anyway: Yes, it’s easy to dehumanize people when you can’t directly see them. This is, presumably, a book for people who see a Banksy of a child in a gas mask and are brought to tears.

Roko’s Basilisk


Now that you know this guy exists, he’s going to kill you.

WARNING: BY READING THE DESCRIPTION OF THIS THOUGHT EXPERIMENT, EVEN KNOWING THAT IT EXISTS, YOU ARE UPSETTING A FUTURE ROBOT GOD! READ AT YOUR OWN RISK!

That dumb disclaimer should give you a little hint of the stench of the pile of philosophical manure you’re wading into. The best thought experiments present a complicated ethical or philosophical problem in an easy-to-visualize format. Roko’s Basilisk does absolutely none of that. It originated from an online forum full of future-obsessed tech guys, so you know you’re in for an absolute treat. I’m going to do my best to describe it in any sort of succinct manner (all the best thought experiments require dozens of specific variables, after all). I’m opening myself up to danger here, because discussing Roko’s Basilisk truly does open you up to a horrible future — that of endless red-faced futurists tracking you down and setting upon your humorous internet article with the ferocity of a condescending mental wendigo.

As brief as I can make it: Roko’s Basilisk first supposes that at some point in the future, a borderline omniscient A.I. is brought into being, basically the outcome of “the singularity.” Once this sentience exists, it will identify those working against its goal (sometimes described as “creating utopia”) as threats worthy of elimination or subjugation. That judgment, though, extends backwards in time, not only to the people presently trying to hamstring it, but to everyone who ever impeded its creation or the chain of events that made it possible. It’s sort of an A.I. version of a despotic ruler seizing power and then jailing or executing everyone who opposed their rise. The danger suggested here: Now that you know about the future existence of this A.I., you are officially on one side or the other, and if you don’t support and assist in its creation, you have just doomed yourself. You have basically just lost The Game, except the punishment is getting chopped up by lasers or thrown in robot jail.

If this horrible malevolent technological God ever arises and asks me why I did not assist in its development, I will simply answer: Dude, I was mostly just eating chips.

Hamlet Monkeys


Smart Things for Dummies, Volume 1

As much as I love monkeys, I never need to hear this tired-ass “thought experiment” ever again. It’s one of a whole smorgasbord of hypothetical situations created to reinforce the idea that yes, infinity really does mean infinity. It’s a delightful mental picture, but it’s wholly unnecessary. If someone doesn’t understand that infinite time creates infinite possibilities, best of luck to them. If the idea of infinite timelines, something that’s a core part of a blockbuster Avengers movie, is too powerful for their brain, I think it’s time to let them keep trying to complete a single crossword puzzle.
