Simon Gwynn
Oct 4, 2021

Facebook whistleblower accuses company of putting profit before safety

Frances Haugen, who left the company earlier this year, was interviewed on CBS, claiming "incentives are misaligned" at Facebook.

Frances Haugen: former product manager at Facebook (LinkedIn)

The whistleblower whose leaks were the basis for The Wall Street Journal report series "The Facebook Files" has said the social media giant prioritised "growth over safety", in an interview last night on the CBS news show 60 Minutes.

Before leaving the company earlier this year, Frances Haugen, a former product manager, copied tens of thousands of pages of Facebook internal research. 

The WSJ reports revealed that Facebook’s own research had found:

  • The company had different procedures in place for high-profile users who violated its rules

  • Instagram was having a damaging effect on the mental health of many teen users – although Facebook pointed out in a response that, on most measures, it was more likely to have a positive than a negative impact.

  • Facebook founder Mark Zuckerberg resisted changes suggested by colleagues, fearing that they would harm engagement

In the interview, Haugen said: “Facebook, over and over again, has shown it chooses profit over safety. It is subsidising, it is paying for its profits with our safety. I'm hoping that [these revelations] will have had a big enough impact on the world that they get the fortitude and the motivation to actually go put those regulations into place. That's my hope.”

Responding to this claim, Lena Pietsch, Facebook's director of policy communications, said: “The growth of people or advertisers using Facebook means nothing if our services aren't being used in ways that bring people closer together – that’s why we are investing so much in security that it impacts our bottom line.

“Protecting our community is more important than maximising our profits. To say we turn a blind eye to feedback ignores these investments, including the 40,000 people working on safety and security at Facebook and our investment of $13bn since 2016.”

Haugen said that when she joined Facebook in 2019, having previously worked at Google, Yelp and Pinterest, she took the job on the condition that she could work on tackling misinformation, saying that she had lost a friend to online conspiracy theories.

She was assigned to Facebook’s Civic Integrity unit, which worked to protect elections. But following the 2020 US election, it was dissolved – although Facebook says its work was distributed to other units.

Haugen’s lawyers have filed a series of complaints to the US Securities and Exchange Commission (SEC), on the basis that Facebook has withheld information that could negatively affect its investors.

John Tye, founder of legal group Whistleblower Aid, said: “As a publicly traded company, Facebook is required to not lie to its investors or even withhold material information. So, the SEC regularly brings enforcement actions, alleging that companies like Facebook and others are making material misstatements and omissions that affect investors adversely.”

In response to this, Pietsch added: “We stand by our public statements and are ready to answer any questions regulators may have about our work.”

Despite her decision to go public, Haugen spoke in defence of Zuckerberg.

“I have a lot of empathy for Mark, and Mark has never set out to make a hateful platform,” she said. “But he has allowed choices to be made where the side effects of those choices are that hateful, polarising content gets more distribution and more reach.”

She added: “It's one of these unfortunate consequences, right? No-one at Facebook is malevolent, but the incentives are misaligned, right? Like, Facebook makes more money when you consume more content. People enjoy engaging with things that elicit an emotional reaction. And the more anger that they get exposed to, the more they interact and the more they consume.”

In a statement provided to 60 Minutes, a Facebook spokesperson said: “Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.

“If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago.”

 
Source:
Campaign UK