A former Facebook product manager has become one of the company’s highest-profile critics after exposing thousands of internal documents she said showed the social media giant failed to protect users.
Frances Haugen, who tackled misinformation on the platform, turned over internal research to US lawmakers and the Wall Street Journal, which reported the company knew, but didn’t disclose, the negative impact of services like Instagram. She said she was sounding the alarm over the company’s practices after seeing repeated evidence that Facebook prioritises profits over the well-being of its users.
“There were conflicts of interest between what was good for the public and what was good for Facebook,” she told CBS News’ 60 Minutes in her first public interview on Sunday.
“Facebook over and over again chose to optimise for its own interests like making more money.”
The revelations have ignited a firestorm for Facebook in Washington as lawmakers accuse the platform of covering up internal research about its negative effects. The trove of documents she handed over shed light on internal discussions about the company’s content moderation efforts, how it treats high-profile accounts differently from other users, and the mental impact its photo-sharing app Instagram has on young users.
One study Ms Haugen uncovered showed Facebook took action on as little as 3 to 5 per cent of hate speech on the platform, and on less than 1 per cent of content classified under “violence and incitement,” according to 60 Minutes.
Ms Haugen is set to appear on Tuesday before a Senate subcommittee on consumer protection as part of a hearing focused on “protecting kids online.” Last week, lawmakers questioned Antigone Davis, Facebook’s global head of safety, over documents that showed Instagram can worsen the mental health of teens who are already suffering.
Facebook spokesperson Lena Pietsch, calling the 60 Minutes segment “misleading,” said in a statement that the company seeks to balance free expression with the need to keep the platform safe.
“We continue to make significant improvements to tackle the spread of misinformation and harmful content,” she said. “To suggest we encourage bad content and do nothing is just not true.”
Ms Haugen said she agreed to take the Facebook job so she could work against misinformation after seeing a friend get wrapped up in online conspiracy theories.
During her time at Facebook, she grew more alarmed by the choices the company was making to prioritise its own growth at the expense of the public, she said.
Included in the trove of documents Ms Haugen shared was a series of internal research slides outlining the impact that Facebook’s photo-sharing app Instagram has on teenagers, reported in September as part of a series of stories by the Wall Street Journal. The research showed that using Instagram often makes things worse for young people who suffer from existing mental health problems, such as anxiety or body image issues.
Her lawyers have also filed at least eight complaints with the US Securities and Exchange Commission, according to the 60 Minutes segment.
Facebook has pushed back on some of the Journal’s stories, claiming that data was “cherry picked.”
Still, the uproar that followed the reports led the company last week to halt plans to roll out a separate version of Instagram for children under 13, citing the need for further consultation with experts, parents and policymakers. Facebook says it’s not abandoning the idea of building the app entirely.
“I still think building this experience is the right thing to do, but we want to take more time to speak with parents and experts working out how to get this right,” tweeted Instagram head Adam Mosseri.
At a hearing on the topic last week, lawmakers blasted Facebook, arguing that the company has focused on profits ahead of efforts to make its products safer for kids.
“We do not trust you,” said Tennessee Senator Marsha Blackburn, the panel’s ranking Republican.