Are Algorithms to Blame for the Spread of Radical Ideas on Social Media? | PBS

By Katherine J. Wu
April 2, 2019

Global communications technology has streamlined the power of suggestion.

Behind every Facebook ad, Twitter feed, and YouTube recommendation is an algorithm designed to keep users using: It tracks your preferences through clicks and hovers, then spits out a steady stream of content in line with your tastes. The longer you spend on the platform, the better the algorithm is doing—and the deeper down various rabbit holes you go.
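The feedback loop described above can be sketched in a few lines. This is a deliberately simplified, hypothetical illustration—not any platform's actual code—in which every click boosts a topic's score, and the feed is re-ranked by those scores, so repeated engagement with one topic steadily pushes it to the top:

```python
# Minimal, illustrative sketch of engagement-driven ranking.
# All function and variable names here are hypothetical.
from collections import Counter

def update_profile(profile: Counter, clicked_topic: str) -> None:
    """Record one click or hover: the signal the feed learns from."""
    profile[clicked_topic] += 1

def rank_feed(profile: Counter, candidates: list[tuple[str, str]]) -> list[str]:
    """Order candidate (title, topic) items by past engagement with each topic."""
    return [title for title, topic in
            sorted(candidates, key=lambda c: profile[c[1]], reverse=True)]

profile = Counter()
for _ in range(5):                          # repeated clicks on one topic...
    update_profile(profile, "conspiracy")
update_profile(profile, "kittens")

feed = rank_feed(profile, [("Cute cats", "kittens"),
                           ("Secret plot revealed", "conspiracy"),
                           ("Local news", "news")])
# ...and that topic now leads the feed, ahead of everything else
```

Even this toy version shows the dynamic the article describes: the ranking has no notion of what the content *is*, only of what the user has engaged with before.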

Some of these forays are harmless; many end in nothing more than an innocuous kitten video or a list of things that will (and definitely won’t) survive your microwave.

But others can have more dire consequences. Guided by suggested hyperlinks and auto-fills in search bars, web surfers are being nudged toward political or unscientific propaganda, abusive content, and conspiracy theories—the same types of ideas that appear to have driven the perpetrators of several mass shootings. That includes the alleged Christchurch gunman, whose white supremacist and neo-Nazi views appear to have motivated the slaughter of at least 50 people earlier this month.


Source: Are Algorithms to Blame for the Spread of Radical Ideas on Social Media? | NOVA | PBS
