Q&A: Frances Haugen still believes in Facebook

March 3, 2022, 10:06 p.m.

Facebook whistleblower Frances Haugen visited Stanford on Thursday, calling on students to advocate for transparency and accountability in technology. The Daily sat down with Haugen afterward to talk about civil disobedience, misinformation and what drove her to blow the whistle that reverberated throughout Silicon Valley. 

During Haugen’s time as a product manager on Facebook’s civic misinformation team, she assembled thousands of pages of internal documents that she later provided to The Wall Street Journal. Her work revealed that the company knew far more than it let on publicly about the harm the platform was causing some of its users, spurring a wave of public outrage.

Haugen’s return to Silicon Valley was organized by campus activist group Fossil Free Stanford, in conjunction with SumOfUs, an external non-profit advocacy group that works on corporate accountability. Representatives from both groups said they hoped Haugen would serve as a powerful example to Stanford students and potential future Facebook employees. 

“We’re working with students and starting to build the idea that you actually have agency in your life, and that your intellectual labor and your work labor are something that you can put where you think it could do the best in the world,” said SumOfUs organizer Melissa Byrne.

This interview has been condensed and lightly edited for clarity.

The Stanford Daily [TSD]: Your revelations about Facebook illuminated the harm its products have inflicted on young people in particular. Could you tell me more about this?

Frances Haugen [FH]: The people who are most impacted by these things are young people. Both on the level of harm — it’s young people who are being harmed most right now — and they’re also being almost entirely excluded from the conversation on what to do. They’re also the ones who have to live with the consequences longest.

I think the thing that most people don’t realize is the kids who are being the most hurt are usually also the kids who are the most isolated. One of the things that came from the disclosure is that Facebook’s own studies found that for the people most at risk for being hyper-exposed to misinformation, the biggest risk factors were being recently widowed, being recently divorced or moving to a new city.

The part that I think most people don’t understand is that once you begin to believe facts that are not part of consensus reality, it makes it harder for you to reintegrate into the community.

TSD: Your decision to blow the whistle has been attributed to many different facets of your education and life experience, including your early career as a debate champion and your upbringing by an Episcopalian minister. What experiences do you think were most relevant to shaping your ethical decision-making?

FH: People only try to change if they believe change is possible. Fatalism is seductive, right? It is much, much easier to say, “Didn’t we lose this fight 10 years ago?” Or, “Don’t we already know Facebook is horrible?” Fatalism is so seductive because it gives you permission to do nothing. 

At Olin, I did my thesis on civil disobedience against Civil Defense. It used to be that everyone would have to go underground or in the shelters for 15 minutes every year. So there was a woman named Mary Sharmat who just couldn’t put up with anymore. She had this brand new baby, she felt like it was madness. If nuclear war is going to kill us, we shouldn’t be pretending we can sit in our basements and be fine. She left the baby with her husband and went and sat on a park bench while they did duck and cover. 

And the next day, she opened up the newspaper and on the second page was this amazing photo of this woman named Janice Smith sitting on a park bench in front of City Hall. And she has an infant in her arms, and she has a toddler standing on the bench next to her. There’s this police officer with a billy club looming above her — great composition. And Mary Sharmat was so relieved to believe she wasn’t alone that she went and called through all the Smiths in the New York City phonebook until she found Janice Smith. 

And it turned out there were two more women in the Bronx who had done the same thing she did. And they all started organizing. And the next year, it was like 100 women, and the year after that it was 500 women, and the year after that it was going to be two or three thousand people in New York City protesting, and they stopped doing the duck-and-cover drills.

TSD: Did you have a Janice Smith — someone whose activism directly empowered yours? 

FH: No. 

TSD: Do you hope to be that for someone down the line?

FH: For years, my core mission has been to wake people up to their own agency. I can’t give you agency. You already have it. You forget it. We get told by lots and lots of people that we don’t have power, but the number one thing that actually allows you to change is to believe that change is even possible.

TSD: Speaking of agency: many students on campus today will be employees at big tech companies in a few years. How much power does the average Stanford student wield when it comes to tech reform? What kinds of ethical responsibilities accompany that?

FH: People who are technologists need to understand that they have a profound responsibility, because these systems are opaque. That’s particularly true of software systems, because most of the ones we’re talking about are closed.

I’m not just trying to reach CS graduates — I’m trying to reach the friends of CS graduates. Because if Facebook knows that I have been priming the current friends of Facebook employees… it’s a lot harder to mislead your employees when people are supported. Facebook needs more employees. It needs more good people working there. There’s lots of great people working there today, but it always needs more. And those people need to be supported in order to feel like they have options.

If these systems are becoming more and more powerful, and they’re closed, and the only people who can even understand them are the people who work at them — you can’t get a master’s degree in these things — you have to ask yourself, “Well, if this can actively impact society, and no one except for the employees can understand it, then you have a unique responsibility, because no one else can save it. And if you don’t come forward, they’ll never know.”

TSD: Do you still believe that meaningful reform can happen internally at Facebook? What’s the alternative?

FH: Facebook is the internet for most of the world. That’s the other thing Stanford students have to understand: Facebook killed the open internet for a billion people. There is no alternative to Facebook for basic information in the most fragile places in the world.

In Africa, their Free Basics program touches like 600 million people. That’s a lot of people. We can’t give up on Facebook, because the market is not going to go in there and dislodge Facebook in Africa. People need to know there are millions of lives on the line, because we’ll look back at this and say, “We had a chance to do something.”

Even something as simple as saying, “You have to publish all the posts that get seen by more than 100,000 people in every country in the world.” 

When I was at Facebook, one of the first moments where I was like, “Oh, this is so much worse than I thought it was,” was the social cohesion team — which is the team that fights genocides, because genocide is what happens when social cohesion breaks down — they would do a thing called virality review where they would have language specialists translate for us the top 10 posts that we were seeing in the tier one at-risk countries every two weeks. And they were all horrific. It’s all severed heads, all kinds of crazy stuff, propaganda. If they just had to publish all the posts that got seen by more than 100,000 people, we would be outraged. We need to have open sourced initiatives that are owned by the public. There are different ways that you can help, but we just can’t leave Facebook behind.

TSD: Even after standing up against Facebook, you’re still a believer in social media’s potential to be a force for good. Why?

FH: Facebook has lots of solutions, and they choose not to use them.

Facebook’s running experiments where they show you more content from your family and friends, and, for free, you get less nudity, less violence, less misinformation. Because the problem is not people. The problem is amplification. And the thing I always come back to is that we can have social media that is about our family and friends, that is human-centric, that allows us to connect with people we care about. We just can’t have one where every year our usage of it grows 2%.

If Facebook were forced to publish much, much, much more data about its operations, it would get pushed back toward something that was about your family and friends and less about this hyper-amplification, because they couldn’t get the safety numbers to improve with their current way of doing things.

But until the incentives change, Facebook will not change, and we have the levers to change those incentives. Facebook has the assumption that people will never stand up to it. The first step is for us to be able to define reality independent of Facebook. And then the second step is for us to act on that. 

We feel like we’re powerless, but we haven’t even begun. We haven’t even begun to fight.

Grace Carroll is the Vol. 261 magazine editor. She was previously a news desk editor. Contact her at gcarroll 'at' stanforddaily.com.