Police posts on Facebook disproportionately represent Black suspects, Stanford-affiliated study shows

Dec. 7, 2022, 9:33 p.m.

Crime-related posts made by police departments on Facebook overrepresent Black suspects by 25% compared to local arrest rates, according to a new study by Stanford Law School professor Julian Nyarko and his co-authors Ben Grunwald of Duke Law School and John Rappaport of the University of Chicago.

The researchers found that this overrepresentation of Black suspects increased with the proportion of Republican voters and non-Black residents in the region where the posts were made.

The study, which was published in November in the “Proceedings of the National Academy of Sciences,” looked at 14,000 police departments in the United States and around 100,000 Facebook posts. The researchers used AI language models to identify posts about crime, then applied further algorithms to determine which of those posts included descriptions of the suspect’s race.
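The article does not detail the models involved, but the overall approach can be pictured as two filtering stages: first find crime-related posts, then check which of those describe a suspect’s race. The sketch below is a minimal Python illustration of that idea, using simple keyword heuristics as a stand-in for the study’s language models; the patterns, function names and sample posts are hypothetical.

```python
import re

# Illustrative keyword lists; the study used AI language models rather than
# keyword matching, so these patterns are only a hypothetical stand-in.
CRIME_TERMS = re.compile(r"\b(robbery|burglary|assault|theft|suspect|arrest(ed)?)\b", re.I)
RACE_TERMS = re.compile(r"\b(black|white|hispanic|latino|asian)\s+(male|female|man|woman|suspect)\b", re.I)

def is_crime_post(text: str) -> bool:
    """Stage 1: flag posts that appear to describe a crime."""
    return bool(CRIME_TERMS.search(text))

def mentions_suspect_race(text: str) -> bool:
    """Stage 2: flag posts that also describe a suspect's race."""
    return bool(RACE_TERMS.search(text))

posts = [
    "Officers arrested a suspect after a burglary on Main St.",
    "Join us for the community picnic this Saturday!",
    "Police are seeking a Black male suspect in connection with a robbery.",
]

crime_posts = [p for p in posts if is_crime_post(p)]
race_described = [p for p in crime_posts if mentions_suspect_race(p)]
print(f"{len(crime_posts)} crime posts, {len(race_described)} with race descriptions")
```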

“We already knew of studies that looked at if the media reports on crime, if the representation of Black suspects is in line with what happens on the ground, and the results of the empirical studies are conflicting,” Nyarko said. “But now that we have this channel of communication where there is no media outlet but rather the police communicating with the general public, we can basically look nationwide.”

Their study compared Facebook posts from police departments across the country with arrest and incident statistics from the FBI. The goal was not to analyze the motivations behind the posts, but rather to ask whether including race descriptions in crime-related posts is justified in the first place.
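One way to read the 25% figure is as a relative gap between the share of Black suspects in race-described posts and their share of local arrests. The numbers below are hypothetical and only illustrate the arithmetic, not the study’s data.

```python
# Illustrative arithmetic only (hypothetical numbers, not the study's data):
# overrepresentation compares the share of Black suspects in posts to the
# share of Black suspects in local arrest records.
share_in_posts = 0.40    # Black suspects as a fraction of race-described posts
share_in_arrests = 0.32  # Black suspects as a fraction of local arrests

overrepresentation = share_in_posts / share_in_arrests - 1
print(f"Relative overrepresentation: {overrepresentation:.0%}")  # -> 25%
```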

“We’re looking at the impact that reporting practices have, so we’re basically saying, irrespective of how you choose the posts — ‘you’ being the police department — the consequence of what you’re doing is the stigmatization,” Nyarko said. “So huge costs to minorities, specifically Black minorities.”

With more and more people relying on Facebook and other social media platforms for news, the study has major ramifications, Nyarko added. “When people consume social media, they should be aware of the biases,” Nyarko said. 

Along with reinforcing negative racial biases, the study found that these posts can also influence policymaking. Grunwald said that when people are exposed to stories about crime that are racially coded, it can influence their support for different kinds of criminal justice policies.

“One thing that is worrying is that when police departments are, as we find, over-exposing people to posts about Black suspects, that can activate racial stereotypes, and in turn, it can make them less supportive of progressive criminal justice policies that would make the criminal justice system function better and be more fair,” Grunwald said.

Stanford Law School visiting fellow Nasiruddin Nezaami, whose research focuses on data, the technology industry and recommendation algorithms, said the study could also shape how people understand, and even define, society. (Nezaami is not affiliated with the study.)

“If you’re part of this new generation, and you don’t have sufficient societal information in your mind, your overall understanding of society is going to be articulated around the data that you’re getting from social media,” Nezaami said. “And when your sole exposure is social media such as Facebook and you don’t have a lot of social interaction, that means that’s where you get how you define society.”

Since the study’s publication, Nyarko has been working on a project with district attorneys’ offices in California to implement an algorithm that removes race descriptions from police reports, with the aim of preventing race, hairstyle or clothing from influencing prosecutors’ charging decisions.

Two district attorneys’ offices already use Nyarko’s algorithm to filter out race descriptions, and recent legislation mandates that all district attorneys’ offices in California adopt the algorithm or another method of removing race descriptions from crime reports by 2025.

“The police write a crime report for the prosecutors, prosecutors read the report and then they say whether to press charges or not. There’s a concern that this is also affected by bias,” Nyarko said. “The question is, can we find any benefit? Is there any justification for including race descriptions?”
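The article does not describe how the redaction tool works internally. As a rough sketch of the general idea, the hypothetical Python snippet below masks race, hairstyle and clothing descriptors behind a neutral placeholder; the term lists and function name are assumptions, not the actual tool.

```python
import re

# Hypothetical redaction sketch: mask race and appearance descriptors so
# prosecutors see a "race-blind" version of the report. The actual algorithm
# used by the district attorneys' offices is not described in the article.
REDACT_PATTERNS = [
    r"\b(black|white|hispanic|latino|asian)\b",  # race terms
    r"\b(dreadlocks|afro|cornrows)\b",           # hairstyle terms
    r"\b(hoodie|do-rag)\b",                      # clothing terms
]

def redact_report(text: str) -> str:
    """Replace race-indicative descriptors with a neutral placeholder."""
    for pattern in REDACT_PATTERNS:
        text = re.sub(pattern, "[REDACTED]", text, flags=re.IGNORECASE)
    return text

report = "Witnesses described a Black male with dreadlocks wearing a hoodie."
print(redact_report(report))
# -> "Witnesses described a [REDACTED] male with [REDACTED] wearing a [REDACTED]."
```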

Lauren is the Vol. 264 managing editor for the sports section and a Science and Technology desk editor for the news section. Previously, she was a Vol. 263 desk editor, beat reporter and columnist. You can contact her at lkoong ‘at’ stanforddaily.com.

Riya Narayan is a writer for The Daily. Contact them at news ‘at’ stanforddaily.com.
