Experts talk Facebook Oversight Board, Trump suspension in Stanford Cyber Policy Center panel

May 6, 2021, 11:26 p.m.

After the Facebook Oversight Board decided to uphold the platform’s suspension of former president Donald Trump’s account, board members and experts scrutinized the process at a Thursday panel discussion hosted by Stanford’s Cyber Policy Center. Some panelists defended the Oversight Board’s creation, while others questioned Facebook’s ability to uphold international human rights standards.

Two members of the Board, Michael McConnell, director of the Stanford Constitutional Law Center, and Julie Owono, an international human rights lawyer, joined the conversation to share insights on the deliberation process and the Board's role within the company.

Following the Jan. 6 insurrection at the U.S. Capitol, Facebook indefinitely suspended Trump's account. Trump had encouraged the rioters, claiming on Facebook and other social media platforms that he had won the 2020 election.

Enter Facebook’s Oversight Board, a quasi-legal body created in 2019 to help govern speech on Facebook’s platforms. As Facebook CEO Mark Zuckerberg put it in an interview with Vox, the Oversight Board is “like a Supreme Court,” making content moderation decisions for Facebook and Instagram. The Board consists of 20 members from around the world, including lawyers, former prime ministers and journalists. 

Two weeks after deciding to indefinitely suspend Trump's account, the company said it would refer the case to the Oversight Board. After months of deliberation, the Board upheld the suspension but advised in its decision that "it was not appropriate for Facebook to impose the indeterminate and standardless penalty of indefinite suspension" and criticized the social media giant for seeking to "avoid its responsibilities" by not applying and justifying a defined penalty.

Though the Board ultimately gave Facebook six months to issue a final verdict, Owono made it clear during the discussion that the Board had not simply "pushed the baby back to Facebook." She noted that the Board presented Facebook with a clear set of human rights standards to use when reexamining the case, specifically citing the need to consider the "potential for future harms."

Alex Stamos, a professor at the Center for International Security and Cooperation and former chief security officer at Facebook, defended the Oversight Board. Stamos identified three “fundamental problems” with Facebook which he believes the Oversight Board addresses: a lack of transparency, unpredictability in company decisions and the proximity between the people deciding company policy and those handling government affairs. 

“Of all the horrible options, this seems like the least bad option,” Stamos said.

Marietje Schaake, international director of policy at Stanford's Cyber Policy Center, international policy fellow at the Stanford Institute for Human-Centered Artificial Intelligence and a former member of the European Parliament from the Netherlands, had a less optimistic view of the Board: "What I see is a company with outsized power where the Oversight Board has a limited mandate to address limited cases and it takes a long time to address them."

“Facebook has the choice, with or without its Oversight Board, to decide to draw stronger principled, moral lines every day, and they’ve not done that,” Schaake said. She pointed to Facebook’s removal of posts criticizing the Indian government’s handling of COVID-19 and to Myanmar, where members of the military used Facebook to incite ethnic cleansing. 

Renee DiResta, research manager at the Stanford Internet Observatory, highlighted the dangers of Facebook’s “newsworthiness exception,” a policy which protects some content from removal if it is “newsworthy, significant, or important to the public interest.” 

DiResta called this exemption an “indemnification for the powerful” since their speech is “inherently newsworthy” and therefore receives a “higher standard of protection.” In an op-ed for the Columbia Journalism Review, DiResta argued for adherence to the “Peter Parker principle” — the larger a user’s audience, the greater the scrutiny they should be subject to.

Stanford Law professor Nathaniel Persily, who helped establish the Stanford-MIT Healthy Elections Project, also asked panelists about the extent to which company interests factor into Facebook’s content moderation decisions. McConnell distinguished Facebook’s profit-seeking institutional structure from that of the Oversight Board, which is indifferent to Facebook’s profitability and able to emphasize human rights, “of which freedom of expression is near the apex.”

Later, Schaake and Stamos clashed over whether Facebook is the correct actor to make content moderation decisions as they relate to human rights.

Facebook, according to Schaake, is "a commercially driven, data-slurping, advertisement-selling, billion-dollar company" that is not "legitimate or qualified" to make these decisions. Instead, Schaake advocated for greater government intervention in the form of "democratically-anchored" accountability systems.

Stamos pointed out that Facebook had acted within legal bounds: the Indian government had lawfully requested that the company remove content critical of the BJP, India's ruling party, from its platform. Moreover, he argued, the legal guidelines governing Facebook would vary greatly from country to country.

DiResta asked Owono and McConnell about the deliberation process and whether the Board considered comments from the public. Both Board members admitted they did not have time to read each of the more than 9,000 comments they received but said they were grateful for the input.

Michael Alisky '24 is a contributing writer for the News section studying international relations and computer science. He is from Aurora, Colorado, and enjoys backpacking and playing chess in his free time. Contact The Daily’s News section at news ‘at’ stanforddaily.com.
