Alex Jones' defamation trials show the limits of deplatforming for a select few

Conspiracy theorist Alex Jones, seen here in 2018, and his network of websites have been banned from most major online and social media platforms but have still managed to bring in tens of millions of dollars in revenue.
Drew Angerer / Getty Images

A fresh defamation trial for conspiracy theorist Alex Jones that began this week could offer slivers of insight into the effectiveness of "deplatforming" — the booting of undesirable accounts from social media sites.

This trial, in Connecticut, is the second of three trials Jones faces for promoting lies on his streaming TV show and Infowars website that the 2012 Sandy Hook Elementary School shooting was a hoax. The victims' families, whom Jones called "crisis actors," have faced harassment, threats and psychological abuse. In August, a Texas jury awarded family members $45.2 million in damages, though Jones says he intends to appeal the decision.

Jones, a serial conspiracist and fabulist, was kicked off almost all major internet and social media platforms in 2018 after he threatened then-special counsel Robert Mueller, who was investigating then-President Donald Trump's ties to Russia. Initially, a round of media coverage touted flagging traffic to Jones' websites as evidence that "deplatforming works." However, revelations from Jones' defamation trials may point to the existence of a rarefied class of extreme internet personalities who are better shielded from efforts to stem the reach of their content.

In the Connecticut trial, a corporate representative for Jones' companies has testified that Infowars may have generated anywhere from $100 million to $1 billion in revenue in the years since the Sandy Hook massacre. Testifying during the previous trial in Texas, Jones told the court that Infowars earned around $70 million in revenue in the most recent fiscal year, up from an estimated $53 million in 2018, the year Infowars was broadly deplatformed.

The difference between Jones and many of the other right-wing actors who have been deplatformed, says political scientist Rebekah Tromble, who directs George Washington University's Institute for Data, Democracy & Politics, "is that Infowars had an existing infrastructure outside of social media."

Infowars makes about 80% of its revenue selling products, mostly dietary supplements, according to court filings from the largest of Jones' nine private companies. He grew his talk radio audience aided by an early partnership with a sympathetic distributor and now owns his own network and independent video-streaming site.

A growing body of research suggests that deplatforming toxic actors or online communities usually does reduce audience size significantly, with the caveat that the smaller audience migrates to less regulated platforms, where extremism then concentrates, along with the potential for violence.

Gauging the effectiveness of deplatforming is complicated, in part because the word itself can refer to different things, says Megan Squire, a computer scientist who analyzes extremist online communities for the Southern Poverty Law Center.

"There's losing your site infrastructure, losing your social media, losing your banking. So like the big three, I would say," says Squire. She says they've all had different impacts depending on the specific case.

Squire's research shows that traffic to Jones' online Infowars Store remained steady for about a year and a half after he was removed from major social media sites. It then declined during 2020 until the lead-up to that year's presidential election and its violent aftermath, when the Infowars Store's traffic saw a massive spike that reached levels Jones hadn't seen since two years before his deplatforming.

Jones' resilience is more of an exception than the rule, says Squire. She points to the case of Andrew Anglin, founder of the neo-Nazi website The Daily Stormer. Following the violent 2017 Unite the Right rally in Charlottesville, Va., he lost his web domain and has had to cycle through 14 more, losing traffic each time. Squire says Anglin is on the run from various lawsuits, which include a ruling that he owes $14 million in damages for terrorizing a Jewish woman and her family.

Post-deplatforming survival strategies

Even after social media bans, conspiracists like Jones find workarounds. Squire says it's common for other users to host the banned personality on their own channels or simply repost the banned person's content. People can rebrand, or they can direct their audience to an alternative platform. After bans from companies including YouTube and PayPal, white supremacist livestreamer Nick Fuentes ultimately built his own video-streaming service where he encouraged his audience to kill lawmakers in the lead-up to the Jan. 6 Capitol riot.

Other internet communities have shown similar resilience. A popular pro-Trump message forum known as TheDonald was banished from Reddit and later shut down by a subsequent owner after the Capitol riot and yet is now more active than ever, according to Squire. When Trump himself was banned from Twitter, Squire watched as the messaging app Telegram gained tens of thousands of new users. It remains a thriving online space for right-wing celebrities and hate groups.

As for raising money, even if extremists are completely cut off from financial institutions that process credit cards or donations, they can always turn to cryptocurrency.

"100% of these guys are in crypto," says Squire, which, she notes, is not necessarily easy to live off. Its value is volatile, and cashing it in is not always straightforward. Still, Squire and her colleagues have found anonymous donors using crypto to funnel millions of dollars to Jones and Fuentes.

"We live in a capitalist society. And who says that entrepreneurs cannot be on the conspiracy side of things as well?" says Robert Goldberg, a history professor with the University of Utah. He points out that conspiracy peddlers have always been "incredibly savvy" with whatever fresh technology is available to them.

"The Klan Atlanta, Georgia, headquarters would sell hoods and robes and all this merchandise, this mark, this bling, if you will, to the 5 to 6 million people who joined the Ku Klux Klan in the 1920s," he says. But aside from the heyday of the KKK, Goldberg says, selling conspiratorial materials about the Kennedy assassination, UFOs or the 9/11 terrorist attacks has generally been far less lucrative, until now.

Power and lies

A bigger question for researcher Shannon McGregor at the University of North Carolina's Center for Information, Technology, and Public Life is what conspiracy entrepreneurs hope to achieve with their reach.

"Why are these people doing this in the first place? What are they getting out of it? And in a lot of cases in this country in particular, in this moment, it's about hanging on to power," says McGregor. Fringe communities always exist in democracies, she says, but what should be concerning is their proximity to power.

She rejects a "both sides" framing of the issue, identifying it as primarily a right-wing phenomenon that dates back decades. "Since like the Nixon era, at least, this right-wing, ultraconservative media ecosystem has been aligned with political power, [which] makes it much more unlikely that it will actually go away," says McGregor.

Deplatforming and punitive defamation lawsuits, she argues, are less of a solution than "harm reduction." When one individual conspiracist or conspiracy site loses its audience, replacements quickly emerge. None of this means, McGregor and other experts agree, that efforts to contain the spread of extremist or anti-democratic narratives should be abandoned altogether.

"I think overall, [social media company] representatives would prefer if the conversation became, 'Oh, well, deplatforming doesn't work, right? ... So, you know, this isn't our responsibility anymore,'" says Tromble.

Squire says there's no doubt that anything that makes it harder for toxic conspiracists to operate smoothly or spread their message is worth doing. It makes the platform they're removed from safer and bolsters the social norm that there are consequences for harassment and hate speech.

Copyright 2022 NPR. To see more, visit https://www.npr.org.

Lisa Hagen
Lisa Hagen is a reporter at NPR, covering conspiracism and the mainstreaming of extreme or unconventional beliefs. She's interested in how people form and maintain deeply held worldviews, and decide whom to trust.