4 Facebook Scandals That Need To Be In A 'Social Network' Sequel
It's crazy that The Social Network came out in 2010, when the shadiest thing Facebook and 26-year-old Mark Zuckerberg had done was maybe screwing over some pals and crashing Harvard's servers via sexism. We had no idea. If there's ever a sequel (which both the screenwriter and the main star have expressed interest in), it would be like Aliens to The Social Network's Alien, only with society-destroying business decisions instead of ribcage-destroying monsters. Here are more moments from Facebook's post-2010 history that we can't help imagining being scored by Trent Reznor …
The "Stalking Users On Other Sites" Controversy
We first talked about this issue 11 freaking years ago and the court case wrapped up just this February, so a Social Network sequel could use it as an excuse for a fun time-passing montage (picture Mark Zuckerberg in court dancing Gangnam Style, Mark Zuckerberg in court with a face mask on, etc.). In 2011, Facebook was accused of tracking users online even when they weren't logged in -- or when they couldn't log in, because they were doing this to people who had never registered on Facebook, too. The accusers claimed that Facebook used "like" buttons on other websites to collect data on you and add it to your shadow profile -- that is, all the data they have on you that you didn't intentionally give them in exchange for two likes and an unrelated comment from your aunt.
At the time, Facebook conceded that they were tracking logged-in users through other websites because they had their permission (as we all know from reading all 18,000 or so words in their terms and conditions every time they're updated), but insisted that logged-out users were safe from this snooping. They said these accusations were "without merit" because Facebook respects your privacy and totally doesn't see you as an endless well of data they can sell to advertisers. Cut to 2022:
The case had actually been thrown out in 2017 by a judge who claimed users couldn't prove they had a "reasonable expectation of privacy" because just being on a website with Facebook's logo anywhere on it must mean you're cool with prostituting your data. Three years later, a federal appeals court disagreed and revived the case, citing wiretapping laws. When the Supreme Court declined to come to Facebook's rescue, the company sighed and agreed to pay $90 million to the plaintiffs, which is one of the largest data privacy settlements in U.S. history ... and also chump change to them. This was way less costly for them than the time they were caught collecting digital scans of users' faces without their consent as part of their photo-tagging features and had to cough up $650 million. And in terms of social impact, it was like a schoolboy prank compared to ...
The Whole "Ripping Society Apart" Thing
In 2017, former senior executive Chamath Palihapitiya claimed that Facebook was "ripping apart the social fabric of how society works," which already sounds like something an Aaron Sorkin-written character would say. It's like reality is already writing the sequel for him. Palihapitiya was vice president for user growth until 2011 and helped the company reach one billion users, which he claims to "deeply regret" now since he believes the "short-term, dopamine-driven feedback loops we've created are destroying how society works." That sounds pretty dramatic for a social media site -- no one ever accused MySpace or Friendster of leading us into the world of Mad Max. And yet, if other former Facebook employees are to be believed, Palihapitiya is underselling the damage.
There's Sophie Zhang, a data scientist who says Facebook willfully turned a blind eye to authoritarian governments using networks of fake accounts to manipulate their own citizens. Or Frances Haugen, another data scientist and product manager, who thinks Facebook had a role in causing the U.S. Capitol attack, among other dumbass developments in recent history. She also said: "The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimize for its own interests, like making more money." (She's no longer at the company if that wasn't clear.)
Another former employee, Mark S. Luckie, is of the opinion that Facebook could prevent horrifying bloodshed in places like Myanmar and Sri Lanka by removing hate speech and misinformation -- or, you know, at the very least, not amplifying that crap to whip up user engagement. According to Luckie, Facebook lets stuff like that fly by to maintain good relations with some governments and political parties. And then there's Christopher Wylie, a whistleblower who never worked at Facebook yet still claimed he had access to data for 50 million Facebook users through his employer, the shady-ass Cambridge Analytica. Facebook corrected him: it was 87 million, thank you very much. You probably know how that mess turned out, but if you didn't, here's a short video about it:
And speaking of abruptly directing users to videos ...
The "Destroying The Internet's Economy" Debacle
Picture this: Jesse Eisenberg talks to his staff in a virtual meeting and tells them there's no reason to freak out after Facebook/Meta lost $251 billion in value in a single day, while also explaining that he's not crying; he just scratched his cornea (from crying too much, presumably). This actually happened earlier this year after a disastrous financial report that caused the company to lose a quarter of its value, but the most ridiculous part is what came after. While trying to reassure his employees that everything was fine, Zuckerberg told them his plan to get the company back into shape: pivoting to video.
Does he ... not remember what happened the last time they tried that? Because we do. Oh, we do.
Remember when all your favorite websites suddenly started spamming you with videos around the mid-2010s? That was largely because Facebook metrics showed them that video content was suddenly doing incredibly well -- certainly way better than lame-ass 1,500-ish word articles. Websites reacted by investing heavily in video content, but then, whoopsie, it turned out that Facebook had "overestimated" (inflated) those numbers by as much as 900%. And they didn't tell anyone for over a year. Hence all those websites dying off or having to fire most of their staff during that period. Now let us pause the article while we stare into the distance for a few minutes.
Not only was the effect on online media devastating -- this didn't even work for Facebook. According to The Wall Street Journal's big deep dive into Facebook's internal documents, the company's heavy-handed efforts to push everyone toward professionally produced video content contributed to the fact that no one seems to give a crap about your heartfelt eight-paragraph post about your Nana dying anymore. So it's ironic that Zuckerberg wants to try that strategy again to compete with TikTok -- an approach that's already backfiring, as demonstrated by the fact that Instagram's TikTok ripoff features seem to be cratering its overall engagement. In short: we foresee lots of "scratched corneas" in Mark's future.
But hey, if he suddenly finds himself unemployed, he can always turn to his other passion: pretending to care about public service.
The "Mark Zuckerberg Pretends He Doesn't Want To Be President" Farce
Any Social Network sequel would necessarily have to turn from a tense drama into an awkward comedy upon reaching the part where Zuckerberg sees the results of the 2016 presidential election and says, "Huh, any asshole can be president ... Wait a minute, any asshole can be president!" Okay, we're speculating on his exact words there, but frankly, we shouldn't even have to because his actual presidential non-campaign was ridiculous enough.
People started getting the impression Zuck was considering a presidential run in December 2016 when he suddenly announced that he was no longer an atheist because "now I believe religion is very important." While he and everyone at Facebook insisted that he totally wasn't getting into politics, that didn't sound very convincing once he started going around the country shaking hands, meeting regular folks, and presumably kissing babies if anyone had volunteered them.
Of course, Facebook's various scandals ended up crushing any political ambitions Zuckerberg might have had, but they still weren't as damaging as the fact that he's Mark Zuckerberg. The more he tried to look relatable and like someone who actually cares about people, the more the memes about him being a robot went viral. His fatal miscalculation was not foreseeing the fact that increasing his exposure just let more people see that he's profoundly awkward and off-putting. If anything, Jesse Eisenberg made him look too human -- though if he ends up being too busy to shoot the entire sequel, they could always use Michael Cera for the presidential campaign part.
Thumbnail: Sony Pictures Releasing, Facebook