News Item

Facebook Can’t Fix Itself Because It Doesn’t Want To | New York Magazine

In light of Facebook executives appearing before Congress this week (and the related controversy over the platform’s unwillingness to ban Infowars), Brian Feldman argues that the company has a fundamental misunderstanding of political bias and partisanship in the media landscape, and that this misunderstanding is unlikely to change.

If you look at the situation quantitatively, as a question of what outlets are being shared on Facebook, its decision to divide publishers into “liberal” and “conservative” is understandable. But qualitatively, as many studies have shown, those “liberal” outlets (which include everything from the New York Times to Mic) occupy a broad range of positions, while the “conservative” sites cluster on the far-right end of the spectrum. As long as Facebook’s understanding of the politics of journalism is this facile, it’s never going to fix its misinformation problem.

[…]

One problem is that Facebook doesn’t need to do anything because it faces no substantial competition in the social media sphere. It is a monopoly that controls a substantial portion of the internet — and, of course, in a vacuum, we wouldn’t want an absolute power like that to ban pages outright. But, of course, we are not in a vacuum. Facebook is, through its size, business incentives, structure, and inaction, enabling a vast conspiratorial far-right propaganda machine. If that was the natural outcome of “free speech,” as it’s been understood in this country for two hundred years, we’d have had a lot more Donald Trumps.

The endless back and forth on this issue — Facebook hemming and hawing as its efforts do little to stem the tide of crap on its platform — points to at least one larger issue, which is that News Feed is simply a bad product. Facebook would not have this issue to anywhere near the extent that it currently does if it did not have the algorithmically ranked News Feed. Its continued pushing of the Groups product is a way to get users to act as moderators, a tactic that keeps most online communities relatively small, sane, and detoxified. The News Feed, however, is moderated entirely through a black box of algorithmic designations and human-moderator judgments, the lack of transparency leading everyone to wonder if they are being punished or if their posts just suck (in Congress today, Representative Steve King literally asked Facebook why Gateway Pundit was not receiving more traffic). The product is fatally flawed. So long as Facebook continues to play judge, jury, but not executioner in secret, it will continue to face criticism from all sides. Luckily, all the company needs to do is smile and nod.

Source: Facebook Can’t Fix Itself Because It Doesn’t Want To | New York Magazine