Facebook doesn’t just provide a platform for those who spread disinformation and toxic discourse — it actively incentivizes those behaviors in order to hook users on its product. That’s the disturbing allegation by company whistleblowers and others, as highlighted in congressional testimony Tuesday.
If in fact Facebook is promoting what amounts to high-tech dog-fighting to stoke anger among its users for the sake of profit, that cannot be allowed to stand without regulatory intervention. Congress can’t censor content on the site, but it can certainly demand that a business that enjoys special protection under federal law implement safeguards to disincentivize this behavior.
No one who uses Facebook regularly could deny it’s an angry place these days, fraught with conspiracy theories, misinformation and hatred aimed at fellow Facebook users in America and abroad. It’s tempting, if depressing, to dismiss that anger as just another example of the nation’s generally toxic political culture today. But what if rather than being just a symptom of that toxicity, Facebook is encouraging it?
That’s the claim of Frances Haugen, a former Facebook employee turned whistleblower who alleges that Facebook isn’t merely a sounding board for all that anger, but that the company actively promotes it for the addictive qualities that keep users coming back. She and others say the company’s algorithms — the computerized criteria that determine what content pops up on users’ screens most often — are tilted toward outrage.
Haugen told a Senate committee Tuesday that the company’s “engagement-based ranking” system gives higher rankings to content “more likely to get clicks” and that will “give little bits of dopamine” to users. The real-world impact, she said, includes giving politicians and their supporters incentive to be nasty rather than constructive in their ads and posts: Nasty rises to the top of the content pile and is seen by more users, while calmer, more constructive discourse gets lost in the algorithmic noise.
The reason Facebook and other tech giants can host unfiltered content that might get them sued if they were traditional publishers is that Section 230 of the Communications Decency Act provides them with special legal protection. The idea was to allow the internet to grow without being stifled by litigation. But if the result is a cesspool of anger for the sake of insane levels of profit, why keep that federal shield in place? That’s leverage Congress should use.
Facebook can’t change human nature, including that dark part of it that craves conflict. But the company can change its algorithms to punish rather than reward bombast, bullying and aggression online. It would likely lead to less obsessive use of its platform by the public — a fistfight will always get more attention than a constructive conversation — which in turn could mean lower profits. But society would be richer for it.
Editorial by STLtoday.com
Distributed by Tribune Content Agency, LLC.