Facebook is deleting some users’ posts based on the color of the block letters

A Facebook Inc. document shows how the social network removes or restores user posts about certain topics in response to a researcher’s request, and the rates vary by location.

The document, obtained by BuzzFeed News, shows Facebook routing content that might breach its community standards to its “content review” team.

A Facebook spokesman described the document as “a request form about a research project” that “has not ever been used by Facebook.”

“Any work like this we do would never result in content being removed without a valid legal reason,” the spokesman said.

Tessa Lyons, a PhD student in the University of Arizona’s cybersecurity communications doctoral program who worked on the project with Sophia Shugart, first wrote about her research in a 2015 personal letter, BuzzFeed reported.

In it, she says: “This project sought to establish a way to identify how frequently Facebook removes content, and how frequently content is removed based on factors outside of Facebook’s content guidelines. It was our goal to evaluate whether and how Facebook’s content policy affects online privacy and to create tools that would allow users to prevent content from being deleted by requesting that taken-down posts be reinstated.”

In that letter, she says she first discussed her research at an academic conference in 2005, though she did not name Facebook, BuzzFeed reported.

Lyons told BuzzFeed that Facebook was never in charge of “picking and choosing which posts you can have and which posts you can’t have.” She said the company would mark posts that violate Facebook’s community standards as “disallowed” and that individuals would request their posts be “revoked,” which Facebook would then edit.

“It was done through requests like ‘titles from Texas …’ and so forth. The flagged posts would go to Facebook’s content review team,” she told BuzzFeed. “The content review team would decide whether or not a piece of content was ‘allowed’ to stay on the platform or whether it should be removed.”

BuzzFeed reported that those requests would lead to Facebook providing users with a box to specify why they believed their post should be restored.

A study Facebook published earlier this year found that the content users didn’t want removed was frequently the same content they didn’t want to see at all. The company said that roughly half the time, fewer than 4 percent of users wanted to see a post that had been purged from the site.
