Youth advocates call for tougher social media regulation

Published Oct. 21, 2024, 12:49 a.m., last updated Oct. 21, 2024, 12:50 a.m.

Social media companies should face stricter regulations, tech accountability advocates said at a Thursday panel, citing the adverse impacts of social media on young adults’ mental health. 

Design It For Us, a youth-led coalition advocating for safer social media platforms for children and young adults, hosted the panel. The coalition began in 2022 as a campaign supporting the California Age Appropriate Design Code — a bill that requires companies to consider the potential harm their products may pose to young users — and has since grown into a coalition that drives policy reforms for social media child safety.

Zamaan Qureshi, the coalition’s co-chair, moderated the panel, which featured Arturo Béjar, a former senior engineering and product leader at Facebook; Vicki Harrison, program director for the Stanford Center for Youth Mental Health and Wellbeing; and Sneha Revanur ’26, founder and president of Encode Justice, an organization that mobilizes young people to advocate for AI safety and fair use.

Béjar said he once believed tech companies cared about the wellbeing of their users. That changed when he had a conversation with Facebook’s chief product officer, who Béjar said knew the exact percentage of users experiencing bullying and harassment on the platform but did not push the company to take measures to remedy the issue.

Social media companies can fix these problems, Béjar said. “They have the technology, and they have the people, but they choose not to.”

Béjar is now helping lawmakers create a safer social media experience for all users by sharing what he learned from his time at Facebook in essays and recommending concrete steps that social media companies can take to protect their users.

Revanur said during the panel that even when employees recognize potential dangers in their company’s products, they are disincentivized from speaking up. She added that policy is necessary to shift this element of tech culture.

“There is a consistent narrative that we have to immunize companies from any sort of liability,” Revanur said, adding that instead, we “need to have structural processes in place to keep users safe.”

Harrison, who heads projects on social media and mental health in young adults at Stanford, said the most important step in changing companies’ product design would be to incentivize health and safety. 

The speakers proposed ways to redesign social media to be safer while keeping in mind the interactions that young users want. Harrison said young people want to choose the content they see. Users trying to decompress and have fun should not have to see unsolicited jarring or explicit images, she added.

“When you go to a library, you can pick what you want to read, and you don’t have people throwing books at you as soon as you walk in,” Harrison said. “Social media should be similar.”  

The general public already supports regulating technology companies, Revanur said, and tapping into support from demographics that activists have not previously reached is key to building momentum behind new legislation.

Harrison said it is important for companies to listen to input from young users. 

“Young users are more of the experts on social media than people in the Capitol,” she said. 

Béjar advised young people hoping to enter the technology industry to take care of themselves and to “have a north star,” explaining that he has seen too many passionate people get crushed by companies.

“Find an environment where your ability is supported and where you can build the future you want to live in,” Béjar said.

Aspen Singh writes for News. Contact the News section at news "at" stanforddaily.com.
