Meta Made Millions in Ads From Networks of Fake Accounts


When Meta’s Mark Zuckerberg was called to testify before Congress in 2018, he was asked by Senator Orrin Hatch how Facebook made money. Zuckerberg’s answer has since become something of a meme: “Senator, we run ads.”

Between July 2018 and April 2022, Meta made at least $30.3 million in ad revenue from networks it removed from its own platforms for engaging in coordinated inauthentic behavior (CIB), data compiled by WIRED shows. Margarita Franklin, head of security communications at Meta, confirmed to WIRED that the company does not return the ad money if a network is taken down. Franklin clarified that some of the money came from adverts that didn’t break the company’s rules, but were published by the same public relations or marketing organizations later banned for participating in CIB operations.

A report from The Wall Street Journal estimates that by the end of 2021, Meta absorbed 17 percent of the money in the global ad market and made $114 billion from advertising. At least some of the money came from ads purchased by networks that violated Meta’s policies and that the company itself has flagged and removed.


“The advertising industry globally is estimated to be about $400 billion to $700 billion,” said Claire Atkin, cofounder of the independent watchdog Check My Ads Institute. “That is a large brush, but nobody knows how big the industry is. Nobody knows what goes on inside of it.”

But Atkin says that part of what makes information, including ads, feel legitimate on social media is the context they appear in. “Facebook, Instagram, WhatsApp, this entire network within our internet experience, is where we connect with our closest friends and family. This is a place on the internet where we share our most intimate emotions about what’s happening in our lives,” Atkin says. “It is our trusted location for connection.”

For nearly four years, Meta has released periodic reports identifying CIB networks of fake accounts and pages that aim to deceive users and, in many cases, push propaganda or disinformation in ways that are designed to look organic and change public opinion. These networks can be run by governments, independent groups, or public relations and marketing companies.


Last year, the company also began addressing what it dubbed “coordinated social harm,” where networks used real accounts as part of their information operations. Nathaniel Gleicher, head of security policy at Meta, announced the changes in a blog post, noting that “threat actors deliberately blur the lines between authentic and inauthentic activities, making enforcement more challenging across our industry.”

This change, however, demonstrates how narrow the company’s criteria for CIB are, which means that Meta may not have documented networks that used other tactics at all. Information operations can sometimes use real accounts, or be run on behalf of a political action committee or LLC, making it more difficult to categorize their behavior as “inauthentic.”

“One tactic that’s been used more frequently, at least since 2016, has been not bots, but actual people that go out and post things,” says Sarah Kay Wiley, a researcher at the Tow Center for Digital Journalism at Columbia University. “The CIB reports from Facebook, they kind of get at it, but it’s really hard to spot.”


Russia accounted for the most ads in networks that Meta identified as CIB and removed. The United States, Ukraine, and Mexico were targeted most frequently, though nearly all of the campaigns targeting Mexico were linked to domestic actors. (Meta’s public documents do not break down how much the company earns by country, only by region.)

More than $22 million of the $30.3 million was spent by just seven networks, the largest of which was a $9.5 million global campaign connected to the right-wing, anti-China media group behind the Epoch Times.
