Facebook’s whistleblower tells Congress how to regulate tech

US lawmakers have been angry at Facebook for years. Since as early as 2011, they have raised alarms about Facebook’s failures to protect users’ privacy, its struggles combating misinformation on its platforms, and its impact on its users’ mental health. But they haven’t passed any new laws addressing those issues.

Now, some key legislators are saying they have the catalyst they need to make real change: whistleblower and former Facebook employee Frances Haugen.

Haugen, once a product manager at the company, testified before the Senate Commerce subcommittee on Consumer Protection, Product Safety, and Data Security in what lawmakers are describing as an urgent call to action to regulate Facebook. The whistleblower prompted a wave of media scrutiny of Facebook when she shared with the Wall Street Journal, the SEC, and Congress thousands of internal documents showing that Facebook has known about the harms its products can cause but has downplayed that reality to lawmakers and the public. This evidence, which has been missing from the conversation until now, reveals how Facebook conducted research that found its products can cause mental health issues, allow violent content to flourish, and promote polarizing reactions, and then largely ignored that research.

“I came forward because I recognized a frightening truth: Almost no one outside Facebook knows what happens inside Facebook,” said Haugen in her opening testimony on Tuesday.

In a statement in response to Tuesday’s hearing, Facebook’s director of policy communications Lena Pietsch wrote that “We don’t agree with her characterization of the many issues she testified about. Despite all this, we agree on one thing: it’s time to begin to create standard rules for the internet. It’s been 25 years since the rules for the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act.”

Give Facebook real external oversight

Haugen repeatedly called for lawmakers to create an outside regulatory agency that would have the power to request data from Facebook, particularly about how its algorithms work and the kind of content they amplify on the company’s social media platforms.

“As long as Facebook is operating in the dark, it is accountable to no one,” said Haugen in her opening testimony. Haugen argued that “a critical starting point for effective regulation is transparency: full access to data for research not directed by Facebook.”

In her written testimony shared ahead of the hearing, Haugen criticized Facebook’s existing quasi-independent oversight board (which has no real legal power over Facebook) because she believes it is “blind” to Facebook’s inner workings.

Stanford law professor Nate Persily, who has worked directly with Facebook on academic partnerships and has acknowledged the limitations of those partnerships, recently called for legislation that would compel platforms like Facebook to share internal data with external researchers.

Create federal privacy laws to protect Facebook users

Privacy wasn’t one of Haugen’s key focuses during her testimony, but several lawmakers, including Sen. Amy Klobuchar, Sen. Marsha Blackburn, and Sen. Ed Markey, brought up the need for better privacy regulation.

Protecting people’s privacy on platforms like Facebook is an area where Congress has already introduced some of its most substantial legislation, including an update to the 1998 Children’s Online Privacy Protection Act (COPPA); the KIDS Act, which would force tech companies to severely limit targeted advertising to children 16 or younger; and the SAFE DATA Act, which would give users rights to data transparency and require opt-in consent for processing sensitive data. So it makes sense that this would be a key part of lawmakers’ potential plans to regulate Facebook.

“Passing a federal privacy standard has been long in the works. I put my first one in 2012 and I think it will be this Congress and this subcommittee that will lead the way,” said Blackburn.

Haugen agreed that how Facebook handles its users’ privacy is a key area of concern that regulators should focus on, but she also said she doesn’t believe privacy regulation is the only solution to mitigating Facebook’s harms to society.

“Facebook wants to trick you into thinking that privacy protections or changes to Section 230 alone will be sufficient,” said Haugen. “While important, they will not get to the core of the issue, which is that no one truly understands the destructive traits of Facebook except for Facebook. We can afford nothing less than full transparency.”

Reform Section 230 — but focus on algorithms

During the hearing, several senators brought up Section 230 — the landmark internet law that shields tech companies from being sued for most kinds of illegal content their users post on their platforms.

Reforming Section 230 would be highly controversial. Even some policy organizations like the Electronic Frontier Foundation and Fight for the Future, which heavily scrutinize tech companies, have argued that stripping this law away could entrench reigning tech giants because it would make it harder for smaller social media platforms with fewer content moderation resources to operate without facing costly lawsuits.

Haugen seemed to understand some of these nuances in her discussion of Section 230. She proposed that regulators modify the law to make companies legally liable for the harmful content their algorithms promote, rather than for specific users’ posts.

“I encourage reforming Section 230 to exempt decisions about algorithms. Modifying 230 around content — it gets very complicated because user-generated content is something companies have less control over. They have 100 percent control over algorithms,” said Haugen.