In the wake of Elon Musk’s $44 billion acquisition of Twitter, Stanford professors and research scholars in law, communications and cyber policy have said they are concerned about Musk’s interpretations of free speech and content moderation.
Musk, the richest person in the world with a net worth of $219 billion, bought Twitter for $54.20 a share in an all-cash deal on April 25. Initially seeking to fend off Musk’s takeover, Twitter’s Board of Directors implemented a “poison pill” — a defensive shareholder rights plan that would have let current company shareholders purchase discounted stock to dilute the prospective owner’s stake. Just 10 days later, with no competing bids having emerged, the board unanimously approved the controversial deal.
Musk’s acquisition “likely signals a significant new — and unfortunate — direction for the company,” wrote law professor and Director of the Program in Law, Science & Technology Mark Lemley BA ’88 in a comment to The Daily.
Under Musk’s control, the future of free speech on the platform is uncertain, according to communications professor emeritus Theodore Glasser. Musk’s bid “is not good news for anyone interested in the nexus between free speech and responsible speech,” Glasser said.
Musk has been forthright about his intent to curb content moderation and is an outspoken supporter of unfettered free speech, despite claims that his push for free speech may extend only to him and his supporters. Still, Musk publicly presents himself as a free speech advocate, tweeting on April 26 that “free speech is the bedrock of a functioning democracy, and Twitter is the digital town square where matters vital to the future of humanity are debated.”
Glasser, who described Musk as a “hardcore libertarian who equates free speech with free enterprise,” said he is concerned about Musk’s plans for the platform because Musk’s defense of free speech does not make exceptions for misinformation or hate speech on Twitter. Leaders of social media platforms should also demonstrate a commitment to public accountability and articulate the rationale for the policies they espouse, according to Glasser, and “Musk offers little reason to believe he’s interested in stopping hateful speech.”
Postdoctoral fellow in the Program on Democracy and the Internet Alessandro Vecchiato said that content moderation is a feature of social media that can be used for good, as it enables “a more positive notion of free speech, one in which every voice has space and can communicate without fear of retribution, violence and hate.”
Beyond his concerns about the scope of the free speech that Musk calls for, Vecchiato said that the meaning behind Musk’s intentions is unclear. Vecchiato added that “more speech is not freer speech,” and that Musk’s conception of free speech “is a negative one,” in which the speech comes “from the elimination of any impediments.”
Another looming question is the status of one of Twitter’s most avid ex-users, former president Donald Trump, who was permanently suspended from the platform following the Jan. 6 insurrection. Under Musk’s leadership, Lemley said he anticipates “the return of Donald Trump to the platform and more circulation of lies and misinformation — precisely the opposite of the direction social media needs to head.”
Musk has also stated that he intends to share the Twitter algorithm to make the platform more transparent. However, merely publishing the algorithm’s design may not be enough to build trust, because what matters more is how the new policies affect the Twitter community, according to Vecchiato.
Some Twitter employees are considering leaving the company. In the meantime, Executive Director of the Content Policy & Society Lab Julie Owono said that “the company will have to focus a lot of efforts on how to restore trust, or reinforce it.”
For Owono, the takeover speaks to broader questions in American society on the future of free speech on the internet.
“It is more urgent than ever that we shift the conversation about online content from whether or not we agree with the content, whether or not we should take it down, to the more fundamental question: What does it look like to limit and fight the bad on the Internet, without sacrificing our democratic values?” she said.