Meta's independent Oversight Board has once again urged the social media giant to be more transparent about its content moderation decisions.
In two rulings issued Thursday, the Oversight Board overturned Meta's decisions to remove user posts from its platforms, finding that the content did not violate the company's policies. In both cases, the board said Meta should give users more information about the actions the platform takes on their content.
The Oversight Board is made up of company-appointed independent experts in areas such as free expression and human rights. Sometimes referred to as "Facebook's Supreme Court," the board hears user appeals of content decisions made on Meta-owned platforms.
The board has made similar calls for transparency before, notably when it upheld Facebook's suspension of Donald Trump in May. In that decision, the board agreed that Trump had severely violated Facebook's policies, but it criticized the company for initially making the suspension indefinite. It asked Facebook to clarify how its actual policies applied to the situation, stating that "by imposing a broad, non-standard penalty and then referring this case to the Board for resolution, Facebook seeks to avoid its responsibilities."
Commenting on the Trump decision in May, Oversight Board co-chair Helle Thorning-Schmidt said of Facebook: "They cannot simply invent new, unwritten rules as they please. It is necessary for them to conduct themselves in an open and transparent manner."
In its first quarterly transparency report, the Oversight Board stated that "Facebook is not being transparent with the people who use its platforms" and urged the company to give users more information about content decisions. Following news reports based on leaked internal documents, the board also reprimanded Meta for withholding critical information about its "Cross-Check" program for reviewing content decisions involving high-profile users. (In response, Meta said it had sought the board's input on Cross-Check and that it would "strive to be more transparent in our future explanations to them.")
In its latest round of decisions, the Oversight Board raised the issue of transparency once more.
Thursday’s Oversight Board rulings
One of the two cases concerned an Instagram post about ayahuasca, a psychedelic brew long used in Indigenous healing and spiritual rituals. The post was removed after review by Meta's automated system and a human moderator; Meta explained that it was taken down because it promoted the use of a non-medical substance. But the Oversight Board found that the post did not violate Instagram's Community Guidelines at the time, which prohibited only the sale and purchase of illegal drugs. The board also rejected the company's claim that the post was harmful to public health, noting that it discussed the use of ayahuasca in a religious context and did not provide information on how to obtain or use the substance.
Much of the board's criticism in this case, however, was directed at Meta's failure to tell the user which of its rules the post had violated.
The board also expressed concern that the company continues to apply Facebook's Community Standards to Instagram without notifying users. "We are baffled as to why Meta is unable to immediately update the language in Instagram's Community Guidelines to convey this message to users," the board stated.
The board ordered Meta to restore the post. It recommended that when Meta acts on users' content, it should "explain to them precisely what rule in the content policy they have violated," and it urged the company to update Instagram's Community Guidelines within 90 days to make them consistent with Facebook's Community Standards.
In response to the board's decision, Meta said the post had been reinstated and that it would review similar content. It added that it would review the board's policy recommendations.
"We are pleased with the Oversight Board's decision," the company said in a statement.
The second case involved a Facebook post from August 2021 featuring a piece of North American Indigenous art, intended to raise awareness about the discovery of unmarked graves at a former residential school for Indigenous children in Canada. According to the user who posted it, the artwork depicts images titled "Theft of the Innocent," "Evil Posing as Saviours," "Residential School / Concentration Camp," "Waiting for Discovery," and "Bring Our Children Home." Facebook's automated systems flagged the post as potentially violating its hate speech policy, and a human reviewer removed it the following day; when the user appealed, a second human reviewer upheld the removal.
After the Oversight Board decided to review the case, Meta determined that the removal was the result of an "enforcement error" and restored the post on August 27, according to a statement released by the board Thursday. However, Meta did not notify the user that the post had been reinstated until more than a month later, and only after the board asked the company about its messaging to the user, a delay Meta attributed to "human error."
The board recommended that Meta "provide users with timely and accurate notice of any company action taken on the content that is the subject of their appeal," adding that "in cases of enforcement errors, such as this one," the notice to the user "should state that the action was taken as a result of the Oversight Board's review process." Meta agreed with the recommendation.
Meta said the board's decision required no further action in this case because the post had already been reinstated, and that it would review the board's policy recommendations.
The Oversight Board noted the two decisions in its second transparency report, also released Thursday, and said it would "monitor whether and how the company lives up to its promises" as Meta responds to its recommendations. The board also announced plans to publish an annual report next year assessing the company's progress in implementing its decisions and recommendations.
"We believe that the cumulative effect of our recommendations will eventually force Meta to improve its transparency and provide benefits to its users," the board wrote in its report.