Frances Haugen, the whistleblower behind the “bombshell” disclosures that have rocked Facebook in recent weeks, spent the majority of Tuesday’s three-hour hearing explaining to Congress how Facebook can be fixed.
While this was not the first time a Facebook critic briefed lawmakers, her insider knowledge and expertise in algorithm design made her especially effective. Her background as a member of the company’s civic integrity team meant she was intimately familiar with some of Facebook’s most serious issues.
During the hearing, Haugen went into great detail about Facebook’s algorithms and other internal systems, which have hampered the company’s efforts to slow misinformation and other problematic content. She also lauded the company’s researchers, calling them “heroes,” and suggested that Facebook make their work public.
Remove algorithmic ranking and go back to chronological feeds
Haugen’s expertise, which gives her a nuanced understanding of how algorithms work and the often unintended consequences of using them, was one of the most notable aspects of her testimony.
“I hope we will discuss as to whether there is such a thing as a safe algorithm,” Sen. Richard Blumenthal said at the start of the hearing. While Haugen never addressed that question directly, she did weigh in on the ranking algorithms that power the feeds in Facebook and Instagram.
She noted that Facebook’s own research has found that “engagement-based ranking on Instagram can lead children from very innocuous topics like healthy recipes… to anorexia-promoting content over a very short period of time.”
She also claimed that Facebook’s AI-powered moderation tools were far less effective than the company had publicly claimed. “We’ve seen from multiple documents in my disclosures that Facebook’s AI systems only catch a very small percentage of the offending content,” Haugen said. “In the case of something like hate speech, the best-case scenario is that they will only ever get to 10 to 20 per cent.”
To address this, Haugen suggested that Facebook switch to a chronological feed in which posts are ordered by recency rather than what is most likely to receive engagement. “I’m a big fan of chronological ranking, or ordering by time with a little spam demotion because I don’t think we want computers deciding what we focus on,” Haugen said.
She noted that Facebook would most likely oppose such a plan because higher-engagement content benefits the platform, encouraging people to post and comment more frequently.
“I’ve spent the majority of my career working on systems such as engagement-based ranking,” Haugen explained. “By coming to you and saying these things, I’m basically damning ten years of my own work.”
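The two ranking approaches Haugen contrasts can be sketched roughly as follows. This is an illustrative toy model, not Facebook’s actual system: the `Post` fields, the engagement score, and the spam threshold are all hypothetical stand-ins for the predictive models a real feed would use.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Post:
    text: str
    created: datetime
    predicted_engagement: float  # hypothetical model score: expected likes/comments/shares
    spam_score: float = 0.0      # hypothetical spam-classifier output in [0, 1]

def engagement_ranked(posts):
    # Engagement-based ranking: surface whatever is predicted to draw
    # the most interaction, regardless of how old it is.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

def chronological_ranked(posts, spam_threshold=0.8):
    # Chronological feed with "a little spam demotion": filter out
    # likely spam, then order purely by recency.
    kept = [p for p in posts if p.spam_score < spam_threshold]
    return sorted(kept, key=lambda p: p.created, reverse=True)

now = datetime(2021, 10, 5, 12, 0)
feed = [
    Post("six-hour-old viral post", now - timedelta(hours=6), 900.0),
    Post("fresh update from a friend", now - timedelta(minutes=5), 3.0),
    Post("engagement-bait spam", now - timedelta(minutes=1), 50.0, spam_score=0.95),
]

print([p.text for p in engagement_ranked(feed)])     # viral post first
print([p.text for p in chronological_ranked(feed)])  # freshest non-spam post first
```

The difference in outcomes is the point of her argument: the first function keeps older, provocative content at the top as long as it is predicted to drive interaction, while the second simply shows the newest posts after screening obvious spam.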
Reform Section 230
In a similar vein, Haugen suggested that Section 230, the 1996 law that shields companies from liability for what their users say and do on their platforms, be reformed “to hold Facebook accountable for the consequences of their intentional ranking decisions.” She stated that such a law would most likely “eliminate engagement-based ranking” because it would become too costly for the company.
At the same time, she warned lawmakers not to let Facebook “trick” them into thinking that changing Section 230 would be sufficient to address the scope of the company’s problems. She also stated that using the law to police Facebook’s algorithms may be less difficult than addressing specific types of content. “Companies have less control over user-generated content, but they have complete control over their algorithms,” Haugen explained.
The emphasis on Section 230 is significant because lawmakers from both parties have proposed a number of changes to the law. During the hearing, Blumenthal stated that he, too, was in favour of “narrowing this sweeping immunity when platforms’ algorithms amplify illegal conduct.”
Senator Amy Klobuchar has also proposed eliminating Section 230 protections for vaccine and health misinformation. Meanwhile, Republicans have attempted to repeal Section 230 for a variety of reasons.
Slow down virality
Similarly, Haugen proposed that Facebook slow down its platform with “soft interventions” that add small amounts of friction to the platform. She cited Twitter’s “read before sharing” prompts as an example of a measure that can help to reduce the spread of misinformation.
“Small actions like that friction don’t necessitate choosing between good and bad ideas,” she explained. “All they do is make the platform less twitchy and reactive. According to Facebook’s internal research, each of these small actions reduces misinformation, hate speech, and violence-inciting content on the platform significantly.”
Facebook has previously taken these steps. Notably, it implemented these “break glass” measures in the days following the presidential election, though the company later reversed some of them. Less than a month later, it made similar changes in the aftermath of the January 6th insurrection.
According to Haugen, Facebook has mischaracterized these changes as harmful to free speech when, in fact, the company is concerned because it “wanted that growth back.” During the hearing, she stated that Mark Zuckerberg had been personally briefed on the potential impact of such changes.
However, she claims that he prioritized the platform’s expansion “over changes that would have significantly reduced misinformation and other inciting content.”
Open Facebook’s research to people outside the company
Access to Facebook’s data has become a hot topic in recent weeks, with outside researchers complaining that the company is stifling independent research. According to Haugen, the social network should work to make its own internal research available to the public.
She proposed that Facebook be allowed to keep its research under wraps for a set period of time, possibly as long as 18 months. However, the company should then make it available to those outside the company.
“I believe in collaboration with academics and other researchers that we can develop privacy-conscious ways of exposing radically more data than is available today. It is important for our ability to understand how algorithms work, how Facebook shapes the information we get to see, that we have these data sets be publicly available for scrutiny,” Haugen said.
A dedicated ‘oversight body’
Aside from internal changes, Haugen stated that a dedicated “oversight body” with the authority to oversee social media platforms should be established. She suggested that a group like this within an agency like the Federal Trade Commission could provide “a regulatory home where someone like me could do a tour of duty after working at a place like this.”
“Right now, the only people in the world who are trained to analyze these experiments, to understand what’s going on inside Facebook, are people who grew up inside Facebook, Pinterest, or another social media company,” she explained.
This “oversight body” would be distinct from the Facebook-created Oversight Board, which advises the company on specific content decisions. While Facebook has said that the Oversight Board is proof that it is attempting to self-regulate, Haugen wrote in prepared remarks that the Oversight Board is “as blind as the public” when it comes to truly understanding what happens inside the company.
It’s also worth noting that Haugen has stated her opposition to attempts to break up Facebook. She stated that separating Facebook and Instagram would likely result in more advertisers flocking to Instagram, depleting Facebook’s resources for making platform improvements.
While it’s unclear which, if any, of Haugen’s recommendations Congress will act on, her disclosures have already caught the attention of regulators. In addition to providing documents to Congress, she has also given documents to the Securities and Exchange Commission.
She has alleged that Zuckerberg and other executives have “misled investors and the public about its role perpetuating misinformation and violent extremism relating to the 2020 election and January 6th insurrection,” according to SEC filings published by 60 Minutes.
Meanwhile, Facebook has continued to push back on Haugen’s claims. A week after an executive told lawmakers that “this is not bombshell research,” the company tried to discredit Haugen more directly.
In a statement, Facebook’s Director of Policy Communications Lena Pietsch said Haugen “worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives,” adding, “We don’t agree with her characterization of the many issues she testified about.” Pietsch also said that “it’s time to begin to create standard rules for the internet.”
In an appearance on CNN following the hearing, Facebook VP Monika Bickert referred to Haugen’s disclosures as “stolen documents” and said the company’s research had been “mischaracterized.”
Later that night, Zuckerberg publicly weighed in for the first time since The Wall Street Journal began publishing stories based on Haugen’s disclosures. (Zuckerberg did once refer to earlier coverage of the scandals, complaining that a news article had mistakenly described his hydrofoil as an “electric surfboard.”)
In his first substantive statement, he said “many of the claims don’t make any sense,” and that “the argument that we deliberately push content that makes people angry for profit is deeply illogical.”
It could still get more difficult for Facebook to counter Haugen, though, particularly if new documents become public. Her letter to the SEC suggests that Facebook knew much more about QAnon and violent extremism on its platform than it let on, as Vice reported earlier.
Haugen may also appear before lawmakers in other countries. European lawmakers, many of whom have expressed concerns similar to those of their US counterparts, have indicated they want to talk to Haugen and conduct new investigations of their own.