A June report from the Stanford Internet Observatory found that the deplatforming of alt-right users from mainstream social media sites following the events of Jan. 6, 2021 helped boost the popularity and income of the fringe “alt-tech” platform Gab.
The Stanford Internet Observatory, a program within the Freeman Spogli Institute and Stanford Law School’s Cyber Policy Center, conducts research on the abuse of information technologies such as social media. According to the Observatory’s report, Gab, an alternative tech platform founded in 2016, caters to a far-right audience and takes an extremely laissez-faire approach to moderation. This lack of moderation has made the platform a hotbed of conspiracy theories. The report also notes that Gab saw a spike in growth immediately after Jan. 6, 2021, when mainstream social media sites like Twitter and Facebook engaged in widespread deplatforming of malicious accounts.
David Thiel, the chief technical officer at the Observatory and an author of the report, said that, as accounts are forced to move from these sites to Gab, they can reach an audience that, while smaller, is a lot more receptive to their content. “That’s part of what makes it hard to determine how much you’re limiting real world spread by shifting [users] to another platform,” Thiel said.
That newfound popularity generated between $1.6 million and $4.5 million in annual income for Gab, which has helped fund its efforts to branch out with other services. So far, these efforts include a video-sharing platform and an advertising affiliate network, which would allow the platform to raise funds via ads rather than user donations or subscriptions, according to Thiel.
As Gab has gained a larger presence in the right-wing social media ecosystem, the platform has also received interest from foreign parties. Renee DiResta, research manager at the Observatory, said that the Russia-based Internet Research Agency has run accounts on Gab. She added that Russian tactics have included “sowing division and amplifying existing partisan rancor.”
Engaging on a platform like Gab where users have already telegraphed their political positions creates an opportunity for foreign assets “to form more direct contacts with particular types of partisans,” DiResta said.
Gab’s growth has also led to a greater real-world presence, including the direct funding of a white nationalist conference and the organization of trucker convoy protests on the platform itself, according to the Observatory’s report.
Fellow alternative social media site Parler was also used to organize real-world events, culminating in the Jan. 6 insurrection, for which it served as a major hub for coordination and communication. Parler likewise saw an influx of users following harsher content moderation efforts by Twitter and Facebook.
As for Gab, while Thiel believes deplatforming toxic accounts ultimately hurt the alt-right movement, he said there are other effective content moderation measures social media platforms could take, such as limiting the discoverability and shareability of toxic accounts. “This can take the form of down ranking results from people that are repeat spreaders of misinformation,” he said.
Another way to curb toxic content could be to look beyond social media. Joan Barata, a fellow at the Stanford Cyber Policy Center, said that while social media is one platform by which malicious actors can disseminate disinformation, these actors will often use “different types of platforms or media to reach out to the public.”
Generally, promoting transparent content moderation and avoiding legislation that might force social media platforms to over-censor content are some ways to limit disinformation while still maintaining freedom of expression, he said.
According to Thiel, Gab may see another surge of growth during an election cycle, as new patterns of disinformation emerge. However, even with aggressive moderation and further account suspensions on mainstream social media sites, Gab is unlikely to see growth “as explosive as the Jan. 6 spike,” he added.
The Observatory’s report concludes that deplatforming ultimately helped a fledgling Gab stay afloat, while noting that more research on whether deplatforming effectively reduces toxic messaging “may prove crucial to developing more holistically effective Trust and Safety practices at the platform, infrastructure and regulatory levels.”