Facebook whistleblower tells Senate inquiry polarizing ads are cheaper to run

Facebook whistleblower Frances Haugen told the Senate inquiry into social media and online safety today that polarizing ads are cheaper to run and generate more engagement.

Haugen told the inquiry: “Facebook says that an ad that gets more engagement is a higher quality ad. And research has shown that angry, divisive and polarizing ads are cheaper to run than compassionate ads, or those trying to find common ground or educate people. We cannot have a democracy when discourse that is divisive, polarizing and extreme is five to ten times cheaper than discourse that tries to understand how the facts work.”

Frances Haugen appeared before the inquiry

Facebook’s advertising policies state, “Ads may not contain content that exploits crises or controversial political or social issues for commercial gain.” The policies also prohibit inflammatory content and implications about personal attributes. According to Facebook’s Business Help Center, the metrics Facebook Advertising uses to determine success include engagement as well as creative asset quality and conversion rate.


The Senate inquiry is part of Prime Minister Scott Morrison’s review of mental health and social media, and is separate from the bill targeting anonymous users who make potentially defamatory comments.

In 2021, Haugen leaked internal Facebook documents to The Wall Street Journal alleging that Facebook prioritized profit over user safety. Facebook categorically denied this.

Haugen reiterated her claims to the inquiry, saying, “The thing I’ve seen time and time again inside Facebook is that Facebook was dealing with trade-offs. Small decisions like: are you willing to lose half a percentage point, 0.1%, in exchange for less misinformation? Facebook chooses again and again to make these trade-offs on the side of its own balance sheet.”

“Mark Zuckerberg’s 2018 white paper – it’s called ‘Demoting Borderline Content and Integrity Strategy for the 21st Century’,” Haugen added. “Facebook was already aware back then, in 2018, that more extreme content was treated as more appealing by the algorithms. That it got reactions and attracted more engagement than more moderate content. Regardless of whether it’s left, it’s right, it’s hate speech, it’s nudity, people are drawn to interact with more extreme content because it’s just part of our brains.”

Haugen also expressed cynicism about Facebook’s ability to self-regulate.

“We don’t let kids score their own test,” she said.

The former Facebook employee championed what she called the One Two Three model.

“Part of the reason I care about this process is to heal the public’s relationship with Facebook, because the public right now has a lot of hostility towards Facebook, because they don’t trust what Facebook says. So I have advocated for what I call the one, two, three, or company, community, accountability plan. So the first step is the company. The company should have to do a risk assessment, a public risk assessment that says, these are what we believe are the risks and harms of our products… that’s not good enough, because Facebook isn’t diverse. It’s geographically isolated, very privileged. We have to pair that with something, say a regulator, where we need to go and [speak to] community groups. When you talk to parents, when you talk to paediatricians, when we talk to NGOs, civil society groups, talk to children and say, what do you think are the harms of this product?”

Haugen continued, “So the company, the community, gives us a pretty solid picture of what we perceive as the harms, and that has to be coupled with accountability. And so accountability, for me, takes two forms. The first is that Facebook currently does not have to disclose, in a manner for which it can be held accountable, what it actually does to solve the problems. It’s a cliché at this point what Facebook does when a new leak comes out. They apologize, they say it’s really hard and we’re working on it, but they say the same thing every time… we have to couple this with data…

“What data, if published, would show that Facebook is making progress on this issue?”

Facebook publishes a Community Standards Enforcement Report “on a quarterly basis to more effectively track our progress and demonstrate our continued commitment to making Facebook and Instagram safe and inclusive.”

Meta, Facebook’s parent company, appeared before the inquiry on Jan. 22, represented by its regional policy director for Australia, New Zealand and the Pacific Islands, Mia Garlick, and its public policy manager for Australia, Josh Machin.

On the accusation that Facebook prioritizes profits over user safety, Garlick said “it’s categorically false.”

Garlick said, “Safety is at the heart of our business. If people don’t have a positive experience, if they don’t find our services useful, they won’t continue to use them. That’s why we regularly increase our investments in security.”

In terms of harmful content detection, Machin said, “There has been a lot of progress in technology’s ability to detect harmful content, but it’s not perfect. There is a long way to go. There are many instances where technology can miss content, but also instances where it may over-enforce or remove material that does not violate our Community Standards because it misunderstands what it is enforcing. In addition to having technology that can detect as much harmful content as possible, we want technology that is also accurate and minimizes the number of appeals people have to make.”

This is the latest controversy Facebook has found itself in. This morning, Andrew “Twiggy” Forrest brought a criminal complaint against Facebook over the alleged misuse of his image in fraudulent cryptocurrency advertisements. Forrest has also launched a civil action in the United States.

Norman D. Briggs