There is no doubt that social media, particularly Twitter, has served as a powerful accelerant and amplifier for the proliferation of extreme ideologies, from Salafi-jihadism to far-right and far-left worldviews. The purging of tens of thousands of pro-Trump, conservative, or conspiracy theory social media accounts in January 2021 brought back memories of earlier purges.
From March 2012 to January 2015, I headed a small U.S. inter-agency organization focused on combating jihadi propaganda, including stemming the rising flood of content by the now-notorious terrorist organization called ISIS, or the Islamic State.
Not all of our work was online, but much of it was. Among other lines of effort, we had small groups of Arabic-, Urdu-, and Somali-speaking operators contesting online space against supporters of jihad. We always felt, and were, outnumbered and outgunned in these digital skirmishes. ISIS even set up a special unit that exploited social media companies' terms of service, mass-reporting and spamming opponents, including the U.S. government, in an attempt to silence us. It seemed that there was "too much ISIS material being disseminated too rapidly." In late 2014, according to scholars J.M. Berger and Jonathon Morgan, ISIS supporters had between 46,000 and 70,000 Twitter accounts. It was only in September 2014 that Twitter slowly began to take some ISIS accounts down, not because of hate speech or incitement, but because they were showing graphic images, including videos of the beheadings of Western hostages.
Why had it taken Twitter so long to act against a group that had already killed thousands, gloated about it online, and promised more? And why was the U.S. government so slow in pushing to get them banned? There were several factors. It sounds naïve today, but at the time there was a certain libertarian mindset among social media companies that these should be unfettered platforms.
As for an American government promoting Internet freedom, there was the concern that pressuring social media companies to take down content (aside from material that was clearly illegal, like child pornography) would empower authoritarians in Russia, China, and Iran. Law enforcement also favored leaving such content up in order to trace networks and gain insight into terrorist plotting. The fact that most of this material (90%) was in Arabic also meant that taking it down was less urgent for Silicon Valley.
Eventually, by late 2015, jihadi swarms were driven from Twitter, only to regroup and flourish on Telegram. Suppression on Telegram caught up with them by 2019. But to this day, these groups have no real difficulty getting their core message out, even if they must rely on lesser-known platforms like Rocket.Chat and others.
It is not my intention to draw any moral equivalence between the removal of partisans of a Foreign Terrorist Organization (FTO) in 2015 and the January 2021 closing of objectionable accounts of American citizens – including that of the sitting U.S. president – by an American company. I am more interested in who should make such decisions and why. Twitter claimed that in most of these cases it was cracking down on accounts promoting misinformation and violence, especially those associated with the QAnon conspiracy theory and the January 6th U.S. Capitol riot, and that it was prompted by fears of further unrest. But business decisions made to solve a local problem can have significant global implications.
The banning of Trump has been decried by some surprising voices, including Putin adversary Alexei Navalny, Mexico's leftist President Lopez Obrador, and German Chancellor Angela Merkel – none of them partisans of the defeated U.S. president. There is longstanding concern in Europe over the monopoly power of American social media companies. And as if the U.S. was not divided and tense enough, the Trump dump was preceded by Twitter's blocking of a major American newspaper's dissemination of a negative news story about the Democratic candidate's son weeks before Election Day, and followed, in January 2021, by what looked like Big Tech colluding in silencing Parler, a right-leaning media platform. Taken together, these three steps, supposedly meant to demonstrate corporate responsibility, instead conjured the specter of corporate overreach.
There are some lessons in all of this, from the ISIS media takedown to today's actions. The ISIS social media purge did work, in that it removed a massive and seductive online presence. But it did not "defeat" the ideological worldview of the Islamic State, which persists. Counter-terrorism experts back then also wrestled with where to draw the line between supposed "extremist views" and violent extremists, and it is all too easy to step over that line. It was, after all, the Obama administration that used the No-Fly List as a counter-terrorism tool against American Muslims. The struggle against ISIS was a boon to regimes – to authoritarianism, to surveillance and control, and to the national security state worldwide.
Another similarity between jihadi movements and far-left/far-right groups is that both increasingly pose the challenge of confronting broad intellectual currents rather than discrete organizations – loose movements rather than specific groups. Not surprisingly, neither for-profit corporations nor democratic Western governments have proven very good at the ideological game. Generally, for both companies and governments, censorship has been a simpler path than ideological sparring, as the fraught experience of deradicalization and counternarratives over the past decade indicates.
Outsourcing these hard decisions to powerful and greedy corporations seems foolhardy. These issues are difficult enough for genuinely democratic governments to tackle (and they often tackle them badly anyway). And even contemplating treating large sections of one's own population like supporters of an FTO is a dangerous temptation.
Nurtured, perhaps naively, on First Amendment dreams, Americans have traditionally scorned European steps to police online speech. I recoil just thinking about it. But the Europeans are troubled both by years of incendiary American online speech and by the recent Trump ban. American and European concepts of free speech will always differ, but perhaps we are not so different when it comes to concern about the power of monopolies. The policing of the digital space by sovereign democratic states through transparent processes, conceivably subject to review by voters at the ballot box, seems far preferable at this juncture to hermetic and partisan decision-making by billionaires and corporations accountable to no one. Both approaches restrict speech – a regrettable step to be taken judiciously.
Entirely legal procedures taken by private corporations implementing their terms of service can be completely understandable given the circumstances, and still problematic. Removing overt incitement to violence is a must, but deciding what counts as incitement, who decides, and under what rules it is done is a discussion worth having – and one not to be left solely to the Big Tech plutocrats.
If there is any policy that needs a bipartisan (or Atlanticist) political dialogue, it is this one.
*Alberto M. Fernandez is Vice President of MEMRI.
Smallwarsjournal.com/jrnl/art/isis-religious-and-extremist-propaganda-social-media-dictionary-based-study-twitter, October 22, 2020.
 Nytimes.com/2021/01/11/technology/twitter-removes-70000-qanon-accounts.html, January 11, 2021.
 MEMRI Daily Brief No. 78, MEMRI VP Alberto M. Fernandez In Congressional Testimony Today: 'After San Bernardino: The Future Of ISIS-Inspired Attacks,' February 10, 2016.
 Foreignpolicy.com/2013/07/30/cyber-jihadists-state-department-now-in-full-blown-twitter-war, July 30, 2013.
 Aymennjawad.org/15782/the-story-of-shami-witness, December 12, 2014.
 MEMRI Inquiry and Analysis No. 1198, Jihadis Shift To Using Secure Communication App Telegram's Channels Service, October 29, 2015.
 Crestresearch.ac.uk/resources/how-telegram-disruption-impacts-jihadist-platform-migration, January 8, 2021.
 Nbcnews.com/tech/tech-news/twitter-bans-michael-flynn-sidney-powell-qanon-account-purge-n1253550, January 8, 2021.
 Stratechery.com/2021/internet-3-0-and-the-beginning-of-tech-history, January 12, 2021.
 Cnbc.com/2021/01/11/germanys-merkel-hits-out-at-twitter-over-problematic-trump-ban.html, January 11, 2021.
 Americancompass.org/the-commons/the-ramifications-of-a-regime-level-politics, January 11, 2021.
 Npr.org/2020/12/10/945000341/supreme-court-says-muslim-men-can-sue-fbi-agents-in-no-fly-list-case, December 10, 2020.
 Algemeiner.com/2019/12/17/the-london-terror-attack-exposes-deradicalization-programs-failures, December 17, 2019.
 Politico.eu/article/thierry-breton-social-media-capitol-hill-riot, January 10, 2021.
 Wsj.com/articles/the-progressive-purge-begins-11610319376, January 10, 2021.