Facebook's algorithm is a sociopath, and Facebook management is too greedy to stop it
Facebook has always claimed that its mission is to bring people together, but a new report from The Wall Street Journal laid bare what many have suspected for some time: Its algorithms encourage and amplify harmful, antisocial behavior for money.
In other words, Facebook's algorithms are by nature sociopaths. And company executives have been OK with that for some time.
Here's what we learned from Jeff Horwitz and Deepa Seetharaman at The Journal:
A 2016 internal Facebook report showed "64% of all extremist group joins are due to our recommendation tools."
A 2018 internal report found that Facebook's "algorithms exploit the human brain's attraction to divisiveness" and warned that if left unchecked they would simply get nastier and nastier to attract more attention.
An internal review also found that the algorithms were amplifying users who spent 20 hours on the platform and posted the most inflammatory content (users who may not be people at all, but rather Russian bots, for example).
Facebook executives, especially Mark Zuckerberg, time and time again ignored or watered down recommendations to fix these problems. Executives were afraid of looking biased against Republicans — who, according to internal reports, were posting the highest volume of antisocial content.
And of course executives had to protect the company's moneymaking, attention-seeking, antisocial algorithms, regardless of the damage they may be doing to society as a whole. Politics played into that as well.
People who suffer from antisocial personality disorder — known in popular culture as "sociopaths" — engage in harmful, deceptive behavior without regard for social norms. Sometimes this is done with superficial charm; other times this is done with violence and intimidation. These people never feel remorse for their behavior, nor do they consider its long-term consequences.
This is how Facebook's algorithms behave. It's how they hold on to users' attention and how, ultimately, the company makes money.
This runs contrary to what the company has been telling us about itself. After the bad rap it developed in the wake of the 2016 election, executives and the company's marketing machine were telling us that Facebook was both financially and culturally committed to encouraging pro-social behavior on the platform by doing things like removing violence and hate speech, making sure conspiracy theories and lies didn't go viral, and cracking down on opioid sales.
Now we know that that commitment was limited. Facebook would not kill the algorithms that laid the golden eggs despite their bias against these goals, or even clip their wings for that matter.
Facebook has claimed time and time again that it's a politically neutral platform, but it's not. Facebook, its executives, and its algorithms are political. They make political decisions not based on their values, but rather based on the interest of the company's survival.
According to The Journal, early on in the company's efforts to fight extreme and antisocial behavior on Facebook, it found that most bad behavior came from a small number of actors. It also found that there were more right-wing accounts engaging in this behavior than left-wing accounts.
As a result, cutting down on bad behavior would hit right-wing Facebook harder. This made executives extremely uncomfortable. The company didn't want to upset Republicans while President Trump was raging about the platform's bias against him and his party, so it did nothing.
Of course, the decision to do nothing is actually anything but. It is inherently political, favoring one party's interests over another, based on Facebook's political calculus for its own survival.
In September 2018, according to The Journal, managers told Facebook's employees that the newsfeed's priorities were changing, moving "away from societal good to individual value." Hostile posts targeting specific groups would not be taken down unless they violated Facebook's company rules, another leg up for extremists, especially those on the right wing posting in the largest volumes. In this way, Facebook's political concerns fed back into the algorithms themselves, like a loop.
Again, Facebook's politics are about its survival as a company — not about its stance on abortion or immigration or anything like that. They're about itself. That probably should have been obvious when the company's entire board went from heavy hitters with Democratic ties to heavy hitters with Republican ties over the course of the Trump administration.
When CEO Mark Zuckerberg waxes philosophical about his hands-off approach to content, this too is a political decision. It inherently gives power to political actors who will help Facebook keep on doing what it's doing, and Facebook knows that.
When Zuckerberg talks about how he would never fact-check Trump on his platform, even when Trump's lies threaten our faith in the democratic process, it is important to remember that the Trump campaign is a Facebook client. And that client already used Facebook to engage in inherently antidemocratic voter-suppression efforts back in 2016. Which is to say, Zuckerberg decided years ago that business comes first, and that protecting democratic principles is not within his purview. Consequences be damned.
Facebook the algorithm is a sociopath, and Facebook the company plays politics to protect that algorithm.
Facebook's interests are inherently out of line with those of our society, yet it wields incredible political and social power. Thanks to Citizens United, the company can use its billions to advocate for the sociopath's ability to sow discord for money. All of this while America is suffering from a debilitating deficit of trust. What a time to be alive.