If Facebook seems like a breeding ground for trolling and vitriol, there's good reason: it's exactly that. Facebook itself came to that same conclusion as far back as 2018. And according to a new story from the Wall Street Journal, once executives were made aware of just how undeniable the problem was, they deliberately axed efforts to fix it.
Reporters for the Journal managed to get access to internal presentations. One in particular, from 2018, was explicit, stating, "Our algorithms exploit the human brain’s attraction to divisiveness." It warned that "if left unchecked" the platform would push "more and more divisive content in an effort to gain user attention & increase time on the platform." This dovetailed with some of Facebook's earlier internal research, like its 2016 finding that "64% of all extremist group joins are due to our recommendation tools," with most of that driven by the "Groups You Should Join" and "Discover" algorithms. A separate presentation on that data was dire, reading, "Our recommendation systems grow the problem."
Facebook designed its algorithms to reward "super sharers," giving far more influence and reach to people who "like," share, or otherwise engage with content most often. The most prolifically active users generally promote hyper-partisan content, and some are so active, online as much as 20 hours a day, that the company flagged them as likely working in shifts or simply being bots. Internally, some Facebook executives tried to start a program called "Sparing Sharing," which would stop giving these super sharers such outsized influence over what other people see. Facebook's own data scientists reportedly believed it would also help cut down on spam and make the platform harder to manipulate into pushing misinformation.
But other executives resisted the idea, claiming, bizarrely, that it would hurt Girl Scout troops trying to sell cookies if those troops happened to have particularly zealous Facebook followers. Eventually Mark Zuckerberg himself weighed in on "Sparing Sharing," and he gutted the program. Per the Journal:
The debate got kicked up to Mr. Zuckerberg, who heard out both sides in a short meeting, said people briefed on it. His response: Do it, but cut the weighting by 80%. Mr. Zuckerberg also signaled he was losing interest in the effort to recalibrate the platform in the name of social good, they said, asking that they not bring him something like that again.
Other programs met similar fates—if they weren't killed outright, they were cut back to the point of uselessness.
Social media in general has helped fuel the rise of violent extremism in the U.S. YouTube has let its platform become a hotbed of right-wing radicalization, thanks in large part to a recommendation algorithm that works much the same way Facebook's does, and like Facebook it has done little to combat the problem. There's also not much incentive for these companies to do so, since the algorithms are designed to keep users online.
Facebook, for its part, seems to have little interest in actually addressing extremism, radicalization, or the polarization that leads to both. Doing so would, apparently, jeopardize the veneer of political neutrality the company tries to maintain. On a 2016 call between the Washington and Silicon Valley offices, Facebook executive Joel Kaplan, a Republican and former George W. Bush White House official, said the company couldn't start more heavily moderating fake or incendiary content "because it will disproportionately affect conservatives."