Internal evidence that former Facebook product manager Frances Haugen shared shows Facebook has known — but ignored — the harm it causes.
On Sunday evening, a former Facebook employee who has previously revealed damning internal documents about the company came forward on 60 Minutes to reveal her identity.
Frances Haugen, a former product manager on Facebook’s civic integrity team, shared documents that were the basis of an explosive series of articles in the Wall Street Journal. The reports revealed that the company knew its products can cause meaningful harm — including negatively impacting the mental health of teens — but it still has not made major changes to fix such problems.
“There were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimize for its own interests, like making more money,” said Haugen in the 60 Minutes interview on Sunday.
The employee also shared new allegations — not previously covered in the WSJ’s extensive reporting — that Facebook relaxed its standards on misinformation after the 2020 presidential election, shortly before the January 6 riot at the US Capitol.
In an internal staff memo obtained and published on Friday by the New York Times, Facebook’s vice president of public policy and global affairs, Nick Clegg, wrote that the responsibility for January 6 “rests squarely with the perpetrators of the violence, and those in politics and elsewhere who actively encouraged them.” Clegg also wrote that Facebook is not a “primary cause of polarization.”
Even for Facebook, which has been mired in PR and political crises for the past five years, this is a staggering moment for the company and the billions of people who use its products. Already, in response to documents revealed by the whistleblower, the company has paused development of its Instagram for Kids product, brought two executives before Congress to testify, and launched a PR offensive dismissing the Journal’s reporting as “cherry-picking.”
The whistleblower has also shared internal Facebook documents with lawmakers and is expected to testify before members of Congress on Tuesday. The fact that she is coordinating with senators reflects how US lawmakers on both sides of the aisle are viewing social media companies like Facebook with more concern — and how they’re becoming more adept at scrutinizing them.
“This is the first time I can remember anything this dramatic, with an anonymous whistleblower, this many documents, and a big reveal,” said Katie Harbath, a former director of public policy at Facebook who is now a fellow at the Bipartisan Policy Center and the Atlantic Council.
While plenty of Facebook employees have spoken out against the company anonymously or internally, only a handful — particularly at a high-ranking level — have ever spoken out on the record against Facebook. And never before have they revealed such strong evidence that the company seemingly understands, but ignores, the systemic harms it causes.
Nor has a Facebook defector ever had this kind of press rollout: first, a series of investigative reports with a major publication, then an unveiling on primetime television, and soon testimony before Congress — all within the span of just a few weeks.
The extent to which Facebook seemingly knew about the harmful effects of its products and withheld that knowledge from the public has caused lawmakers such as Senator Richard Blumenthal (D-CT) to compare the company’s tactics to those of Big Tobacco.
Facebook has already responded to the allegations with a playbook defense, similar to its response to President Biden’s criticism that the platform was “killing people” because of the spread of Covid-19 misinformation. The company and its leaders are arguing that the allegations are sensationalized and untrue, that information is being taken out of context, and that Facebook isn’t the only one to blame for the world’s problems.
And just as it did during the recent Covid-19 misinformation dispute with President Biden, Facebook has questioned the credibility of outside research about how its platforms function.
This time, the company went so far as to discredit some of its own internal researchers’ findings about Instagram’s negative effects on teenagers’ mental health. Last week, it distributed an annotated version of the original research first published in the Journal, in which Facebook said that its own researchers’ slide titles “may be sensationalizing” findings about how Instagram can negatively contribute to teenage girls’ body image issues. The company also said the size of the study was limited.
The fact that the company is disputing the topline findings of its own staff’s research shows just how damaging the reporting coming out of the whistleblower’s documents is, and how urgently the company is moving to change the narrative.
“It is a big moment,” said Yaël Eisenstat, Facebook’s former Global Head of Elections Integrity Ops, who has been a vocal critic of the company since she left in November 2018. “For years, we’ve known many of these issues — via journalists and researchers — but Facebook has been able to claim that they have an axe to grind and so we shouldn’t trust what they say. This time, the documents speak for themselves,” she told Recode.
A key reason why this latest scandal feels more significant is that politicians on both sides of the aisle feel deceived by Facebook: they had previously asked CEO Mark Zuckerberg about Instagram’s mental health effects on children and teenagers, and the company wasn’t forthcoming. At the time, Zuckerberg said he didn’t believe the research “was conclusive,” and that “overall, the research that we have seen is that using social apps to connect with other people can have positive mental health benefits.” But he did not disclose the negative findings in the research cited in the WSJ reporting, including that, among teens in the study who reported suicidal thoughts, 13 percent of British users and 6 percent of American users traced the desire to kill themselves to Instagram.
The company also didn’t share the research in response to two separate inquiries from Rep. Cathy McMorris Rodgers (R-WA) and Senator Ed Markey (D-MA), who asked for Facebook’s internal research on the matter after a March congressional hearing.
And more of Facebook’s current and former employees — instead of being quieted by the company’s reported tightening of internal communication — are starting to discuss the company’s issues more openly on Twitter and in internal settings like company message boards, according to reporting from the New York Times.
Some researchers working at the company feel “embarrassed” that Facebook dismissed the quality of its own staff’s work, according to The Times. Facebook, like other major tech companies, prides itself on hiring world-class researchers and engineering talent. If it further taints its image in the engineering and academic communities, it could limit the caliber of employees it’s able to recruit.
“I think Facebook is miscalculating what a watershed moment this is, not just because the public now has eyes on these documents, but because employees are starting to get angry,” Eisenstat told Recode.
In the coming days, the attention around the whistleblower will likely shift to include her personal story: her background, what she worked on at Facebook, whether she has any incentive to share this information other than the public good, and how she might face legal challenges or even retaliation for her actions (Facebook executives have testified under oath that the company will not retaliate against whistleblowers).
But the whistleblower coming forward is about much more than one individual. In revealing thousands of documents involving the work of many people at the company, work that top executives subsequently largely ignored, this whistleblower has reignited longstanding debates both inside and outside the company about Facebook’s flaws.
“[The whistleblower] has provided an unvarnished and unprecedented look at the extent to which Facebook executives have knowingly disregarded the life-and-death consequences of their own products and decisions,” Jesse Lehrich, co-founder of the policy nonprofit Accountable Tech, told Recode. “And she’s paved the way for others to speak out.”