Meta’s Oversight Board says Facebook should change the rules for wartime posts

‘When violence is itself lawful under international law, speech urging such violence presents different considerations.’

By Adi Robertson / @thedextriarchy

The Meta Oversight Board has overturned a Facebook moderation decision about a post comparing Russian soldiers to Nazis, saying Meta should take special care with moderation during an “unlawful military intervention.”

The semi-independent board’s decision, published today, involves a Facebook post published by a Latvian user. The post shows an image of a person killed in Bucha, Ukraine, paired with Russian text saying Russia’s army “became fascist.” (Notably, the picture does not depict violent wounds and thus would not normally trigger policies on graphic content.) The post ends with a 1940s Soviet poem including the lines “kill the fascist... Kill him! Kill him! Kill!”

Meta removed the post for violating its hate speech guidelines but restored it after a complaint to the Oversight Board, adding a warning screen for violent and graphic content.

The Oversight Board found that the poem was a rhetorical device and the comparison didn’t violate Meta’s hate speech policies as written. The post, it said, was making a historical argument comparing the behavior of Russian soldiers to Nazis at a particular point in time, not making a blanket claim that Russians were comparable to Nazis.

“Neither Meta’s human rights responsibilities nor its hate speech community standard protect soldiers from claims of egregious wrongdoing or prevent provocative comparisons between their actions and past events,” the board said. It also found that the violent content policy shouldn’t have been applied, noting that the policy refers vaguely to pictures of “violent” deaths without further clarification that could help users figure out the standard.

More generally, the Oversight Board urged Meta to consider the context of a violent conflict between Russia and Ukraine — which, it notes, is widely accepted as unlawful. “The use of force as self-defense against such acts of aggression is permitted” by international agreements, the board notes. “When violence is itself lawful under international law, speech urging such violence presents different considerations that must be examined separately.”

The post was made in April 2022, around two months after Russia began its still ongoing invasion of Ukraine. Global human rights advocates, including those at the United Nations, have determined that the Russian military indiscriminately or deliberately targeted Ukrainian civilians. In March 2022, Russian forces allegedly massacred civilians in Bucha — possibly including the subject of the Facebook photograph.

While the board didn’t believe this post violated Facebook’s rules, it asked Meta to change its policies for armed conflicts “to take into consideration the circumstances of unlawful military intervention.” The board did not offer specific suggested language for such a change.

The board also recommends clearer public-facing rules for posts that include potentially violent imagery or language, saying Meta should allow “neutral reference to a potential outcome of an action or an advisory warning” — even if that potential outcome involves violence. In particular, the board asks Meta to explore the option of letting users decide whether they want to see warning screens for graphic content, offering the option to have them turned off by default.

The decision is part of Meta’s long-standing struggle to police content during violent conflicts. The company faced harsh criticism for allowing users to foment genocidal violence in Myanmar, but in this case, the Oversight Board is urging it to adopt a looser standard for an ongoing war — where, it notes, one side has broad international support.

