A disinformation platform with ties to the Jordanian government used new social media platforms to spread pro-monarchy and pro-military narratives, according to a report by the Stanford Internet Observatory.
Facebook suspended the Jordanian network on July 8, marking the first time an inauthentic presence from Jordan has been suspended by a social media platform, the report said. And while the disinformation network wasn’t especially complex — it featured few innovative strategies to attract viewers, and its content was limited — it used newer social media platforms to spread partisan narratives, sparking interest among Stanford researchers.
“They were doing the same things we always see these networks doing — creating fake news pages, those sorts of things,” Shelby Grossman, a research scholar at the Observatory and the lead on this project, said. “But then there were also novel tactical aspects to this operation.”
Before the network reached Grossman and her team, it was identified by both Facebook and Twitter for “coordinated inauthentic behavior.” While the content the network was posting was not illegal on the social media platforms — it was opinionated, partisan and political, but not falsified — the network involved fake accounts and failed to disclose it was linked to a Jordanian government actor.
“Here the content was really just like, the King of Jordan is great. That’s not objectionable — it’s just like an opinion,” Grossman said. “So they’re identifying these networks based on behavior, not content.”
When Facebook first uncovered the Jordanian network, the social media platform sent the accounts to Grossman and her team, among other third-party researchers, before the network was suspended. Researchers commonly receive access to such networks to provide further insight into the narrative, tactics and off-platform activity of such accounts.
The Stanford team included scholars, research assistants, analysts, undergraduates and a high school student from The Nueva School.
“I delegate different parts of the network to different people,” Grossman said. “One person will become the Instagram expert, and they are the expert on the Instagram presence.”
The network was suspended first on Facebook, where it had the largest following. Although 14 of the 35 Facebook pages did not have any followers, four garnered over 80,000, with one amassing 317,068 followers. Many of the pages were created in late 2020, while two pro-army pages began in 2015.
The network was also present on Twitter, but the content did not catch on, and Twitter is not a novel platform for disinformation. Twitter banned the accounts in late June or early July and followed its policy of making all suspended content public to any viewer.
As the investigation continued, however, the team learned that the network was also present on TikTok, a video-sharing platform particularly popular among youth. The adaptation to TikTok inspired the team to dig deeper.
“We talked to a lot of people, and we feel pretty confident that this is the first time ever there has been political, coordinated, inauthentic behavior on TikTok that has been publicized,” Grossman said.
Grossman and her team were surprised. Given the prevalence of the platform and its popularity among impressionable youth, she said, she would have expected to encounter such inauthentic activity sooner.
TikTok, however, does not have the same transparency policies that Facebook and Twitter have. While it refuses to tolerate accounts that “exert influence and sway public opinion while misleading individuals and our community about the account’s identity, location, or purpose,” it does not release accounts to researchers or make suspended content public. In fact, the accounts shut down by Facebook and Twitter were still active on TikTok as of Aug. 2, despite the publication of the Stanford report.
Grossman’s team was not originally aware of the network’s presence on TikTok. On a whim, in the middle of the investigation, one researcher decided to reverse image search some of the network’s Facebook profile pictures. Using Yandex, a Russian search engine, they were able to find TikTok profiles with the same stock image.
The network’s engagement on TikTok was underwhelming. While the mere presence of these accounts was significant, it was “unclear why there was subsequently such minimal effort to generate engagement with the content,” the report stated.
For instance, a TikTok page under the false profile of “selenakhalidi” had just three followers. And although two of its videos had almost 500 views, the content was limited. Two other TikTok pages, created under the fake personas of Manal Aldajaa and Fahed Hammad, also received very little attention.
In addition to resharing content on TikTok, the Jordanian network made use of recorded audio from Clubhouse, an audio-chatting platform that spiked in popularity over the past year. As with the TikTok discovery, Grossman and her team believe this is the first time suspended actors have used Clubhouse as a weapon of disinformation.
The audio, inferred to be a recording of actors outside of Jordan discussing the disgraced Prince Hamza, was used by the network to allege foreign interference in the affairs of the monarchy, the report concluded. The Jordan Open Source Association, a nonprofit promoting open networks, found that Jordanians can access Clubhouse only through a virtual private network (VPN).
“They’re claiming that this Clubhouse discussion represents what they call fourth-generation social media warfare,” Grossman said. “They are trying to spread the narrative that Jordan is under attack from nefarious foreign social media people.”
While social media suspensions can breed retaliation — the Turkish president once threatened to ban Twitter after pro-government accounts were taken down — Grossman said that the relative insignificance of this network will limit the fallout. She does hope, though, that unearthing these accounts could prompt TikTok to both actively monitor the platform for disinformation campaigns, and to share such networks with researchers for further investigation.
“All platforms have this problem; all platforms have disinformation campaigns. So, it really shouldn’t be embarrassing to the platforms when they’re uncovered,” Grossman said. “It should be a good thing that they looked for them, found them and then were transparent about them.”