The good news is that the Internet has given us greater access to extended family, news from remote parts of the globe and pictures of exotic genitals we would have never been able to see in the real world. The bad news is that the Internet is also pitting neighbor against neighbor in new and innovative ways that only technology could have made possible.
The worse news? It's not getting better any time soon, thanks to ...
5 New Algorithms That Make Sure You Only Talk to People You Agree With
The only reason you know anything about how to be a good human being is because other people told you when you screwed up. It's not pleasant being told you smell or that your jokes aren't funny or that your scrotum has fallen out of your pants, but it's also the only way you know to start showering or learn funnier jokes or move to a more open-minded neighborhood. You already knew this -- we can all think of rich people and celebrities who are surrounded by "yes men" who never give them honest feedback and who get so disconnected that they basically go crazy (see: Michael Jackson, George Lucas).
Yet the Internet is building that bubble of "yes men" around you, right now, and you don't even notice. For instance, everyone's favorite social networking site, Facebook, is filtering your friends according to how much you agree with them. It's not some crazy conspiracy theory, it's a computer algorithm. Here:
There's a downside to kids paying too much attention in math class.
What, did you think that Facebook just gave you all of your friends' updates in order? Nope -- not unless you tell it to. By default, it filters them according to your preferences, and it knows your preferences because it keeps track of all of the links you click on. If you click on a lot of left-wing news stories, it will start filtering out your right-wing friends.
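Facebook's actual ranking code is proprietary, but the basic idea described above, scoring posts by how often you've clicked links on the same topic, can be sketched in a few lines of Python. Everything here (the topic labels, the scoring rule) is invented for illustration, not Facebook's real algorithm:

```python
from collections import Counter

def rank_feed(posts, click_history):
    """Toy feed ranker: posts whose topic you click most float to the top.

    posts: list of (post_text, topic) tuples
    click_history: list of topics of links you've clicked
    """
    affinity = Counter(click_history)  # how often you've clicked each topic
    # Sort by your click affinity for each post's topic, highest first.
    return sorted(posts, key=lambda post: affinity[post[1]], reverse=True)

posts = [
    ("Right-wing uncle's rant", "right_politics"),
    ("Cat video", "cats"),
    ("Left-wing think piece", "left_politics"),
]
clicks = ["left_politics", "left_politics", "cats", "left_politics"]

for text, topic in rank_feed(posts, clicks):
    print(text)
```

Run it and the right-wing rant sinks to the bottom of the feed, not because anyone decided it should, but because the click counter never voted for it.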
Tech expert Eli Pariser calls this effect the "filter bubble," and its implications are pretty sinister. You've already seen this if you are a younger person continually embarrassed/frustrated by the idiotic Facebook hoaxes your older family members fall for. No, Uncle Frank, Obama did not ban the use of the phrase "Christmas tree" and didn't paint over an American flag with his own logo. How can he not recognize these as silly urban legends? Because everyone who would tell him so has been filtered out. Bad information can circulate forever in a bubble where everybody agrees with it.
"Standing in this room right now, I can just feel that 2012 is the mullet's year!"
After all, the same kid annoyed with Uncle Frank might in the next moment give a knee-jerk "Like" to a fake quote from Rick Santorum or Sarah Palin. And Facebook remembers, so the next link to come along that's popular with your political faction will get promoted right to the top. Your life becomes an endless stream of links telling you that everything you already believe is right, and there is no reason to ever question it. Your computer might as well have a mechanical arm that comes out and continually pats you on the back for being so awesome.
But you can still comment on other people's links, right, and set them straight? Not so fast -- users have reported seeing this after trying to comment on a friend's status:
"I really don't see how my balls aren't relevant to this!"
The comment in question wasn't inviting anyone to an underground Nazi get-together. But it did set off a spam filter, apparently because it was long and included three links. In other words, if you find yourself in a Facebook debate, the one thing that can get you filtered as spam is daring to give too much explanation or too many sources to back up what you're saying. It's probably best to just call everyone Nazis/communists and go on about your day.
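Facebook hasn't published its spam rules, but a crude length-and-link-count heuristic like the one that apparently tripped here is easy to sketch. The thresholds below are guesses for illustration only:

```python
import re

def looks_like_spam(comment, max_links=2, max_length=500):
    """Crude spam heuristic: flag comments that are long AND link-heavy.

    The thresholds are invented for illustration; whatever rules
    Facebook actually uses aren't public.
    """
    links = re.findall(r"https?://\S+", comment)
    return len(links) > max_links and len(comment) > max_length

lazy_insult = "You're wrong, Nazi."
sourced_rebuttal = ("Actually, here's why that's false: " + "detail " * 100 +
                    "http://a.example http://b.example http://c.example")

print(looks_like_spam(lazy_insult))      # the lazy insult sails through
print(looks_like_spam(sourced_rebuttal)) # the cited rebuttal gets flagged
```

Note the perverse incentive: the more effort and evidence you put into a comment, the more it looks like spam to a filter built this way.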
4 New Methods to Make Misinformation Spread Faster
"Well, sure," you might say, "that's why I don't get my freaking news from Facebook, dumbass!" Good for you! And you don't want to depend on just one source, because every site has a bias of some kind. No, the savvy consumer of news and opinion will let Internet democracy tell them what to read -- you go to a site where the users submit the links, like MetaFilter or Digg or Reddit. So, just a month ago, the front page of Digg had this amazing story about a crazy female dentist who got angry at her boyfriend and pulled all of his teeth:
Hell hath no fury like a woman with access to pliers and laughing gas scorned.
This is what the Internet is good for, right here -- a great, weird story to tell around the water cooler tomorrow, especially if your co-workers are male ("Women sure are CRAZY, right, boys?"). Too bad it's a hoax; the source for that submission is listed as HuffingtonPost.com, but they were simply passing along a link they themselves found. Where? Why, the Daily Mail -- a bullshit U.K. tabloid that is prone to making amusing stories up out of thin air to get clicks. As usual, the story turned out to be complete fiction. But not before it was shared on every news portal (and tens of thousands of times on Facebook).
Here's an even scarier story from the front page of Reddit: Russia is massing troops on Iran's northern border, ready to start World War III if the U.S. attacks Iran:
If they take over Iran, they get seven bonus troops next turn.
And the source is BusinessInsider.com. Well, that sounds respectable as shit, that's like the Wall Street Journal or something, isn't it? Actually, that site pulled the article from crazy conspiracy site World Net Daily (if you go there at any given moment, you'll likely see a front page headline about how Obama's birth certificate has definitely been proven fake this time).
Come on, "Honolulu" doesn't even sound real.
The story cites unnamed sources. So if you follow the path, it goes:
Front Page of Huge News Portal -> Popular Website -> Crazy Conspiracy Site -> Imagination of Random Unnamed Guy Who Didn't Get Enough Attention as a Child. Did we mention that Russia doesn't border Iran?
But very few people followed that path, because the Internet is physically changing the way we read.
Instead of reclining on the front porch to leisurely peruse the 11,282-word sentence at the end of Ulysses (you know ... like we used to), we're browsing headlines, or skimming through our RSS feed or Tumblr. The brain is like a muscle -- it gets good at whatever you spend the most time doing, and what we spend all of our time doing is skimming. Whether we mean to or not, we're training our brain to have a shitty attention span -- and we mean it's actually changing the shape of our brain, building up our ability to skim an ocean of facts and decreasing our ability to actually stop and dig into the details.
"It's about a lobotomized boy who gets hit by a train."
And that means we are getting really, really bad at sniffing out bad or false information.
It's awesome that anybody with a keyboard can get published online -- that's what the Internet is all about, after all. But it's not so great when we lose the ability to figure out whether the story about Kim Jong-un being assassinated came from a BBC reporter at the scene or a 14-year-old 4chan poster reporting from his bedroom. The legit news sites, the tabloid news sites, the great blogs, the shitty blogs ... all of it gets swirled together in the pot, and what floats to the top isn't the stuff that's true, but the stuff that is first with the story (even if it's wrong) or the most inflammatory (and not surprisingly, user-generated content tends to be more biased than professional content).
"This just in from our news affiliate dongtacular.org ..."
And that leads to another problem ...