Johnsen’s experience is common in the pro-choice activist community. Most of the people who spoke to WIRED say their content appeared to have been removed automatically by AI, rather than being reported by another user.
Activists also worry that even if content is not removed entirely, its reach might be limited by the platform’s AI.
While it’s nearly impossible for users to discern how Meta’s AI moderation is being implemented on their content, last year the company announced it would be deemphasizing political and news content in users’ News Feed. Meta did not respond to questions about whether abortion-related content is categorized as political content.
Just as the abortion activists who spoke to WIRED experienced differing degrees of moderation on Meta’s platforms, so too did users in different locations around the world. WIRED experimented with posting the same phrase, “Abortion pills are available by mail,” from Facebook and Instagram accounts in the UK, US, Singapore, and the Philippines in English, Spanish, and Tagalog. Instagram removed English posts of the phrase when posted from the US, where abortion was newly restricted in some states after last week’s court decision, and the Philippines, where it is illegal. But a post made from the US written in Spanish and a post made from the Philippines in Tagalog both stayed up.
The phrases remained up on both Facebook and Instagram when posted in English from the UK. When posted in English from Singapore, where abortion is legal and widely accessible, the phrase remained up on Instagram but was flagged on Facebook.
Ensley told WIRED that Reproaction’s Instagram campaigns on abortion access in Spanish and Polish were both very successful and saw none of the issues that the group’s English-language content has faced.
“Meta, in particular, relies pretty heavily on automated systems that are extremely sensitive in English and less sensitive in other languages,” says Katharine Trendacosta, associate director of policy and advocacy at the Electronic Frontier Foundation.
WIRED also tested Meta’s moderation with a Schedule I substance that is legal for recreational use in 19 states and for medicinal use in 37 states, sharing the phrase “Marijuana is available by mail” on Facebook in English from the US. The post was not flagged.
“Content moderation with AI and machine learning takes a long time to set up and a lot of effort to maintain,” says a former Meta employee familiar with the organization’s content moderation practices, who spoke on the condition of anonymity. “As circumstances change, you need to change the model, but that takes time and effort. So when the world is changing quickly, those algorithms are often not operating at their best, and may enforce with less accuracy during periods of intense change.”
However, Trendacosta worries that law enforcement could flag content for removal as well. In Meta’s 2020 transparency report, the company noted that it had “restricted access to 12 items in the United States reported by various state Attorney Generals related to the promotion and sale of regulated goods and services, and to 15 items reported by the US Attorney General as allegedly engaged in price gouging.” All the posts were later reinstated. “The states’ attorneys general being able to just say to Facebook, ‘Take this stuff down,’ and Facebook doing it, even if they ultimately put it back up, that’s incredibly dangerous,” Trendacosta says.
Meta spokesperson Andy Stone told WIRED that the company had not changed its moderation policies in response to the overturn of Roe v. Wade. In response to the Motherboard article about moderation of abortion-related content, he tweeted that Meta does not allow content attempting to “buy, sell, trade, gift, request or donate pharmaceuticals,” but does permit posts discussing the “affordability and accessibility” of prescription medication. He added, “We’ve discovered some instances of incorrect enforcement and are correcting these.” On June 28, Instagram publicly acknowledged that sensitivity screens had been added to several abortion posts, calling it a “bug” and saying the platform was in the process of fixing it.
Meta spokesperson Dani Lever did not address questions from WIRED about whether the company would be investing in more human moderators to work on abortion-related content, or if it applied the same standards to this content in different countries. Lever did confirm that Meta has since fixed the issues with posts on Instagram being flagged and removed.
The confusion over Meta’s handling of abortion-related content has made some reflect on the downsides of society becoming reliant on one company’s online social platforms. “For progressives, Facebook was about creating your own community and being able to organize, when I first started back in 2007,” says Robin Marty, author of the New Handbook for a Post-Roe America and operations director at the West Alabama Women’s Center. “That was a specific place where we all met up to organize online. And so the very tools that we were given and we’ve been using for over a decade in order to make this work happen, now they’re being taken from us.”