Introduction
How important is the internet to the far-right provocateurs who use it? After the 2017 “Unite the Right” rally, the city of Charlottesville threw the book at its self-described white nationalist organizers. Investigators discovered that the march had been planned in Facebook groups and chats, advertised using memes from 4chan and Reddit, spread like wildfire across Discord servers and gaming lobbies, and virtually attended through live-streams, forums, and chat rooms. In response, activists, scholars, and policymakers pushed technology companies to regulate far-right content on their platforms. Social media platforms like Facebook and YouTube removed offensive accounts, such as those of the Proud Boys, in a tactic called “deplatforming.”
Other web companies went further than content moderation. Some far-right websites were pulled from the internet entirely by their service providers for terms of service violations. After the Charlottesville rally, the domain registrar Network Solutions denied service to Stormfront, the oldest white nationalist forum on the internet. When service providers cut off hate websites, they sever, at least temporarily, the connection between those sites and the World Wide Web, resulting in significant losses of users and income. An early consensus emerged among scholars and activists alike: this process of “deplatformization,” or the systematic denial of internet services to toxic communities, decimated the alt-right’s ability to mobilize in the years following Unite the Right (Hayden et al., 2022; Thompson & Hawley, 2021).
On the surface, this assessment appears correct. After deplatforming, several core alt-right websites, like the Daily Stormer blog, lost over half of their web traffic (Squire & Gais, 2021). Some hotspots, such as the alt-tech platform Parler, have been completely eliminated as of the time of writing. This assessment also reflects developments in the analog world. The alt-right groups that showed up in Charlottesville were not meaningfully present at the January 6th insurrection in 2021, nor did they mobilize during the “Stop the Steal” campaigns that preceded it. Should we interpret these signs of obsolescence as reason to believe that tech companies can clean up after themselves?
My research suggests otherwise: deplatforming overlooks how much power far-right provocateurs have already built on digital media. This article dives into three examples – PayPal, Gab, and Telegram – to explicate how this new mode of platform power strengthens the threats to democracy represented by the far-right. Like every infrastructure, the internet must be maintained to stay in working order. But unlike many other infrastructures, the internet is not treated as a public utility. Instead, it is owned and operated by private companies. When billionaire Elon Musk purchased Twitter, nothing could stop him from upending the social network’s content moderation policies and re-platforming white nationalists who had been deplatformed between 2017 and 2022. The internet is governed by its biggest owners, and reactionaries are working to own as much of the web as they can.
Platform governance versus public relations: The case of PayPal
As the public grew aware of the symbiosis between far-right provocateurs and digital platforms, some firms treated the situation as a public relations crisis (Hill, 2023). Surprisingly, the payment processing giant PayPal was among the first to begin deplatforming alt-right and other far-right users from its services. On August 16, 2017, less than a week after the Unite the Right rally, PayPal released a statement explaining that it had suspended dozens of hate sites (Blake Montgomery [@blakersdozen], 2018). One year later, PayPal updated its Acceptable Use Policy to expand its ability to act against the hate speech of the online far-right. The firm cited this policy when it suspended the accounts of January 6th Capitol rioters.
PayPal’s fast response was surprising for two reasons: one practical, the other ideological. Practically, the presence of white nationalists on PayPal does not directly affect the experience of other users on the application. From this, we can infer that PayPal deplatformed not to improve its product for customers, but to protect its reputation. Ideologically, PayPal was co-founded by Peter Thiel, a conservative libertarian who, in his own words, “no longer believe[s] that freedom and democracy are compatible” (Sandifer & Graham, 2017). Thiel bankrolls far-right endeavors, from conservative influencers in New York City to start-ups promising oceanic fiefdoms for billionaires, to influencing presidential candidate Donald Trump’s choice of vice presidential nominee in the 2024 election. Even though Thiel stepped away from PayPal in 2002, he retains considerable influence. He is the center of what insiders term the “PayPal Mafia,” a network of powerful investors and engineers, including Elon Musk, who have led or funded platforms like YouTube, Yelp, and Reddit.
Should we interpret PayPal’s actions against right-wing extremism as steps in the right direction? Unfortunately, using the same policy, PayPal also suspended the accounts of anti-fascist individuals and organizations in 2018. When pressed about these suspensions by journalists, PayPal reiterated its new acceptable use policy against “other forms of intolerance that is discriminatory” [sic]. Against whom did these antifascists discriminate? Without citing a specific incident behind the alleged violation, PayPal suggests that organizing against racism produces the same kind of violence as Unite the Right, an event that resulted in the death of a counter-protester and an incalculable spread of hate speech.
PayPal’s both-sides logic reveals the dangers of appealing to technology companies to regulate the far-right. In the antifascist groups, PayPal saw a threat to its reputation among conservative and libertarian investors, individuals like Thiel and Musk who actively finance far-right platforms and cultural networks (Silverman, 2022). A few years later, these same investors would return to hamper the firm’s governance. In October 2022, PayPal further revised its Acceptable Use Policy to prohibit users from posting content that “promote[s] misinformation.” This revision occurred against the backdrop of widespread mis- and disinformation campaigns concerning the 2020 presidential election and vaccines. Joined by former PayPal president David Marcus, Elon Musk initiated a #PayPalCancelled hashtag campaign. While a few other platforms were receiving similar complaints at the time, PayPal’s critics were well organized and owned significant shares in the company. As a result, PayPal stock dipped, and the policy was reversed.
When technology firms use content moderation as an opportunity to reinforce the company brand, they equate the health of the public sphere with the confidence of customers and investors. Consequently, the whims of ultra-rich individuals weigh more heavily than the social needs of the public. Like Thiel, PayPal tinkered with regulating its product – only to conclude, in practice, that the freedom of investors matters more than the health of our democracy.
Parallel infrastructure for the imperiled: The case of Gab
When far-right groups get deplatformed, they usually find more obscure sites on the internet to call home. However, when no suitable alternative exists, entrepreneurs create their own platforms. Derived from the alt-right moniker, “alt-tech” describes a network of digital platforms that caters to deplatformed users and communities. Researchers often describe alt-tech platforms as “moderation free,” but this characterization is imprecise. All platforms moderate content that threatens profitability. Alt-tech platforms differ in that they advertise their refusal to moderate hate speech.
Enter Gab: an “alt-tech” social network that describes itself as “The Free Speech Social Network and Pioneer of The Parallel Economy.” Established in 2016 as a safe space for far-right users removed from mainstream social networks like Facebook and Twitter, Gab markets itself as a censorship-free social network with low-to-no restrictions on hate speech. Since its founding, Gab has considerably expanded its features; it now offers AI chatbots and image generators, a payment processor (appealing to PayPal exiles), a marketplace, and video hosting.
Gab has survived numerous deplatformization attempts, a signal that far-right technology entrepreneurs are becoming adept at securing their own digital infrastructure. Most infamously, on October 27, 2018, Robert Gregory Bowers murdered 11 Jewish worshippers at the Tree of Life synagogue in Pittsburgh. Before the attack, he announced his intentions on his verified Gab account. Bowers was conditioned toward violence within an antisemitic ecosystem intentionally fostered on Gab (Than et al., 2020). After the massacre, technology companies took notice. Gab was cut off by PayPal, GoDaddy, Backblaze, and several other service providers. Its infrastructure short-circuited, Gab disappeared from the internet two days after the shooting.
But Gab’s deplatforming lasted only six days. On November 4, the website found a new domain registrar, Epik. Epik has a history of intentionally marketing its services toward deplatformed far-right websites. Like alt-tech platforms, Epik encoded reactionary appeals in its moderation guidelines. Until 2023, its terms of service prohibited customers from being “acutely sensitive to controversial material” or “not respect[ing] non-censorship.” These policies signaled the firm’s intention to host far-right websites like Gab, which use the language of “non-censorship” to tacitly advertise to reactionaries who see themselves as “victims of censorship.” Although Epik would go on to denounce Gab in 2024 in an effort to distance itself from racism, it still hosts a range of lesser-known alt-tech websites.
Gab is creating a blueprint for a reactionary parallel internet. The platform has only grown in size and sophistication since its suspensions, and allegedly rakes in between $1.6 and $4.5 million in annual revenue – much of which is reinvested into the platform’s “cancel-proof” infrastructure. Gab has found alternatives to PayPal in Bitcoin and its own payment processing service, Parallel; to the cloud storage service Backblaze in the web host Sibyl Systems; and to the Google and Apple app stores in a fork of the open-source platform Mastodon. This “stack” of digital services can be replicated by future reactionary webmasters.
Pro-democracy researchers should not wait for Gab to find an alt-tech ISP before asking serious questions about “whack-a-mole” deplatforming. Gab’s ambition is to create a “parallel internet,” or a suite of services completely unaccountable to the regulatory ambitions of Big Tech firms, though it has yet to fully achieve this independence. Nevertheless, even at their least successful, alt-tech platforms afford entrepreneurs lucrative opportunities to expand their portfolios. At their most successful, these platforms become hubs where reactionaries exchange ideas, news, and tactics with each other and with an increasingly mainstream audience enthralled by the false promise of anti-censorship.
DMs against democracy: The case of Telegram
The early consensus on deplatforming omitted the duplicity of complicit digital platform companies, such as PayPal, and the resilience of alt-tech infrastructure, exemplified by Gab. Additionally, it continues to overlook a tendency among the far-right to produce new networks using whatever channels are available to them.
Since at least 2018, U.S. far-right networks have found a new base of operations on Telegram. Telegram is an encrypted instant messaging service – indeed, one of the largest messaging apps in the world, with over 800 million monthly active users. Its origin story reflects its pro-privacy brand. Released in 2013 by brothers Pavel and Nikolai Durov, Telegram was created after the tycoons behind the Russian social media giant VK refused to hand over user data to the Kremlin. Telegram has since built a brand around its refusal to monetize user data or hand it over to law enforcement. (Notably, these promises have not been kept – see the 2022 Privacy Report.) The app insists that its lack of content moderation is necessary so it can serve citizens of authoritarian countries – a fantasy with which U.S. far-right groups identify as they claim to be censored on the internet.
In the United States, Telegram functions similarly to an alt-tech platform due to its hospitality toward hate speech. After the 2021 deplatforming of the alt-tech site Parler, interest surged so quickly that Telegram became the most downloaded smartphone app in the world for several months, beating WhatsApp and Messenger. Telegram now serves as the main channel for a wide range of U.S. reactionary communities, including Patriot Front (the largest white nationalist organization in the United States), the “Active Club” network of white nationalist mixed martial arts clubs, the “White Lives Matter” network, eco-fascists, anti-vaxxers, Christian nationalists, and numerous neo-Nazi terrorist cells like the Clockwork Crew.
Telegram’s privacy features give political provocateurs leverage over their local communities. The app affords a suite of security options backed by cryptographic algorithms. Combining these technical features with victimhood fantasies, self-described Nazis ensure that all of their operational plans unfold within secret chats, inaccessible to researchers and their scraping programs. Only by subpoena can law enforcement – and, eventually, researchers and the public – learn the content of these messages, meaning a hate crime must take place before the process of discovery even begins. When Unite the Right was planned in 2017, critical operational details were publicly accessible in Facebook posts. Now, when white nationalists plan their next move, they do so in an algorithmic fog of war.
If this affinity continues, the next alt-right may form in a digital enclave of secrecy, out of sight of regulators until it is too late to intervene. Telegram has shown no interest in proactive content moderation, in part because doing so might empower censorship in the authoritarian countries where most of its global users live. The same can be said of messaging apps like Discord and Signal – but Telegram outpaces both in the number and political extremism of its users. Hence, the problem is not that encrypted messaging exists. The problem is that a titanic technology company like Telegram can profit by curating a perfect environment for extremism while flagrantly ignoring its consequences.
Conclusions and recommendations
Like every infrastructure, the internet must be maintained to stay in working order. The question that may define our democracy is not whether the internet is maintained, but by whom and for whom. I join a growing chorus of voices suggesting that the problem of reactionary media can only be addressed by recompiling the internet as a democratic utility (Fuchs, 2021; Krasodomski-Jones, 2020; Tarnoff, 2022).
A democratic internet is not so far afield. Imagine, for example, if the decision to block Gab had never fallen to the CEO of GoDaddy because his employees, organized as a union, had refused to let the firm work with Gab in the first place. Imagine if content moderation were conducted by unionized, trained professionals rather than by AI chatbots and grossly underpaid workers bound to company secrecy. Imagine if Cambridge Analytica had never existed, because the data it mined was already publicly available and transparently presented.
Unfortunately, a vast gulf separates the current state of tech policymaking from a democratic internet. The most sophisticated efforts remain stuck in the logic of deplatformization. For example, the Digital Services Act empowers the European Union to fine platforms that host harmful content. These fines make alt-tech platforms less profitable over time. However, enforcement is under threat from a new wave of far-right parliamentarians whose parties have been probed under the DSA (Scott, 2024; Tar, 2024). Human Rights Watch further reports that the DSA has already been leveraged to censor Palestinian speech on Facebook. It is difficult to overstate the damage a far-right parliament might do by intentionally weaponizing the DSA against its political adversaries. The problem with the DSA is the same as the problem with PayPal’s acceptable use policy: proper enforcement is trumped by the private interests of the enforcers.
A digital tech deal might bridge the gap between our current dictatorship of platforms and a democratic web to come. As Michael Kwet (2022) outlines, a digital tech deal (DTD) would systematically diminish the power of private technology companies to make room for grassroots, decentralized, open-source, and abolitionist tools to power the internet from below. A DTD recognizes that the far-right is merely a symptom of the threat to democracy represented by platform capitalism, so it prioritizes redistributing ownership and sustainable stewardship of infrastructure. The DTD vision includes socializing internet infrastructure such as cloud servers, subsidizing open-source and interoperable platforms, and implementing public subsidies to compete with and ultimately replace private investment in digital media production. Ongoing projects demonstrate the potential of a DTD: from the British Digital Cooperative, a proposal to shift platform production from multinational companies to local communities, to Ethan Zuckerman’s Initiative for Digital Public Infrastructure, which researches and builds digital platforms for an open and public internet.
A DTD does not promise a panacea it cannot provide. It will not “kick the bad guys off the internet.” Rather, it invites policymakers, activists, and researchers to stop buying into the system of platform capitalism that keeps far-right provocateurs relevant. By increasing the power of local communities to shape the web, a DTD makes space in the stack, and in our imaginations, for new ways to moderate content, regulate data, and participate in online public life.
Bibliography
Blake Montgomery [@blakersdozen]. (2018, November 9). PayPal statement: “We carefully review accounts and take action as appropriate. We do not allow PayPal services to be used to promote hate, violence, or other forms of intolerance that is discriminatory.” Said a lot of companies are grappling w free expression vs tolerance. [Tweet]. Twitter. https://twitter.com/blakersdozen/status/1060969921704747008
Fuchs, C. (2021). The Digital Commons and the Digital Public Sphere: How to Advance Digital Democracy Today. Westminster Papers in Communication and Culture, 16(1), Article 1. https://doi.org/10.16997/wpcc.917
Hayden, M. E., Gais, H., Miller, C., Squire, M., & Wilson, J. (2022, August 11). “Unite the Right” 5 Years Later: Where Are They Now? [Southern Poverty Law Center]. Hatewatch. https://www.splcenter.org/hatewatch/2022/08/11/unite-right-5-years-later-where-are-they-now
Hill, S. (2023). ‘Definitely not in the business of wanting to be associated’: Examining public relations in a deplatformization controversy. Convergence. Advance online publication. https://doi.org/10.1177/13548565231203981
Krasodomski-Jones, A. (2020). The Liberal Democratic Internet—Five Models for a Digital Future (pp. 44–65) [Digital Policy Lab Provocation Paper]. Institute for Strategic Dialogue. https://www.isdglobal.org/wp-content/uploads/2021/04/EN-DPL-Companion_Papers-12Apr.pdf
Kwet, M. (2022, May 31). Digital Ecosocialism: Breaking the power of Big Tech. Transnational Institute. https://www.tni.org/en/article/digital-ecosocialism
Sandifer, E., & Graham, J. (2017). Neoreaction a Basilisk: Essays on and Around the Alt-Right. Eruditorum Press.
Scott, M. (2024, February 14). The EU’s online content rulebook isn’t ready for primetime. POLITICO. https://www.politico.eu/article/european-union-digital-services-act-dsa-thierry-breton/
Silverman, D. (2022, October 18). The Quiet Political Rise of David Sacks, Silicon Valley’s Prophet of Urban Doom. The New Republic. https://newrepublic.com/article/168125/david-sacks-elon-musk-peter-thiel
Squire, M., & Gais, H. (2021, September 29). Inside the Far-right Podcast Ecosystem, Part 3: The Rise and Fall of ‘The Daily Shoah’ [Southern Poverty Law Center]. Hatewatch. https://www.splcenter.org/hatewatch/2021/09/29/inside-far-right-podcast-ecosystem-part-3-rise-and-fall-daily-shoah
Tar, J. (2024, June 6). What a right-wing shift in the EU Parliament means for tech policy. Euractiv. https://www.euractiv.com/section/digital/news/what-a-right-wing-shift-in-the-eu-parliament-means-for-tech-policy/
Tarnoff, B. (2022). Internet for the People: The Fight for Our Digital Future. Verso.
Than, N., Rodriguez, M. Y., Yoong, D., & Windel, F. (2020). Welcome to Gab Alt Right Discourses (arXiv:2007.09685). arXiv. https://doi.org/10.48550/arXiv.2007.09685
Thompson, J., & Hawley, G. (2021). Does the Alt-Right still matter? An examination of Alt-Right influence between 2016 and 2018. Nations and Nationalism, 27(4), 1165–1180. https://doi.org/10.1111/nana.12736