On 9 August 2022, DSLU submitted its public comments in response to the Meta Oversight Board's call on the Russian poem case (2022-008-FB-UA). These comments will aid the Oversight Board's deliberation on the case and possible updates to Meta's policies on hate speech and graphic content, and will be published together with its decision.
The case concerns a Latvian user who posted a photo and text in Russian to their newsfeed. The photo shows a street view with a person lying still, likely deceased, on the ground, with no wounds visible. In the text, the user commented on alleged crimes committed by Soviet soldiers in Germany during the Second World War. The user draws a connection between the Second World War and the invasion of Ukraine, arguing that the Russian army “became fascist.” They write that the Russian army in Ukraine “rape[s] girls, wound[s] their fathers, torture[s] and kill[s] peaceful people.” The user concludes that “after Bucha, Ukrainians will also want to repeat … and will be able to repeat” such actions. At the end of their post, the user shares excerpts of the poem “Kill him!” by Soviet poet Konstantin Simonov, including the lines: “kill the fascist so he will lie on the ground’s backbone, not you”; “kill at least one of them as soon as you can”; “Kill him! Kill him! Kill!”.
The post was viewed approximately 20,000 times. The same day the content was posted, another user reported it as “violent and graphic content.” Based on a human reviewer’s decision, Meta removed the content for violating its Hate Speech Community Standard. Hours later, the user who posted the content appealed, and a second reviewer likewise assessed the content as violating the hate speech policy. The user then appealed to the Oversight Board. After the Board selected the appeal for review on May 31, 2022, Meta determined that its previous decision to remove the content was in error and restored it. On June 24, 2022, 24 days after the content was restored, Meta applied a warning screen to the photo in the post under the Violent and Graphic Content Community Standard, on the basis that it shows the violent death of a person.
In light of our experience communicating with social media platforms during the ongoing Russian-Ukrainian war, and taking into account international human rights standards, DSLU recommends that Meta:
- Amend its hate speech and incitement to violence policies to prioritize analysis of the user’s intent and the likelihood of violence occurring when qualifying statements under these policies in times of armed conflict;
- Establish an international crimes exception to its violent and graphic content policies to preserve content with potential evidentiary value for international tribunals;
- Provide greater protection to artistic speech on the platform;
- Create a dedicated content moderation team for each significant international armed conflict.
Read our full submission below.
Digital Security Lab Ukraine (hereinafter – DSLU) is a digital rights organisation in Ukraine. We have additionally contributed as trusted flaggers for major online platforms since February 2022. Our experience over the last five months shows that online platforms were ill-equipped to address pressing moderation issues, including the contextual understanding of hate speech, the application of violent and graphic content policies, and alleged coordinated behaviour by users representing the warring parties.
Under the UN Guiding Principles on Business and Human Rights, the responsibility of business enterprises to respect human rights refers to internationally recognized human rights, including freedom of expression. Proceeding from this principle, DSLU will answer the following questions relying on international freedom of expression standards:
- How Meta’s policies on hate speech, incitement to violence, and graphic content should be tailored to international armed conflicts;
- How artistic speech should be balanced against hate speech prohibitions;
- How Meta should treat content moderation during an international armed conflict.
As to the first question, DSLU acknowledges that international armed conflicts are marked by a rise in harmful content posted online. It is natural for users who spend a significant amount of their spare time online to reflect emotionally on current events, especially when their state is under attack. Such statements may occasionally involve calls for hostility, violence, and discrimination. Other users, such as media, NGOs, and other watchdogs, will use social media to advocate for their causes, be it reporting international crimes, criticising the governments of opposing parties, calling for international aid for one of the belligerent parties, or calling for peace at all costs. This may involve depictions of atrocities, such as decomposed bodies and dismembered body parts.
It is beyond doubt that such content is harmful to the general public and, more specifically, to minors: two categories of content widely recognized as “clearly unlawful”, in the words of the ECtHR in Delfi AS v Estonia and subsequent case-law. It is also clear that any calls for international crimes, such as genocide, war crimes, and crimes against humanity, shall be prohibited regardless of context. Further analysis is required for other types of hateful statements.
Under the UN Rabat Plan of Action, six factors should be taken into account when analyzing the capacity of a hateful statement to lead to harmful consequences and thus be prohibited. Two factors which gain significance during an international armed conflict are the intent of the speaker and the likelihood of harm, including its imminence. In a heated social context, some words belonging to a low register of style shall be treated as emotional disapproval or rejection of the ongoing situation and given less weight (see Savva Terentyev v Russia).
A similar view of the importance of intent was endorsed by the ECtHR in Kilin v Russia, even outside the war context. The Court outlined that such intent is key and might be established where there is an unambiguous call by the person using hate speech for others to commit the relevant acts, or it might be inferred from the strength of the language used and the previous conduct of the speaker. We concur with this view and call on Meta to adopt it when deciding future hate speech cases related to international armed conflicts or modifying its policies.
In the context of the ongoing Russian-Ukrainian war, DSLU witnessed the removal from the platforms of posts depicting mass killings in Bucha, Irpin, and other cities in the vicinity of Kyiv. Instagram even blocked several hashtags (such as #buchamassacre) for a limited time, presumably because they linked to such pictures. By doing so, it restricted users’ access to public interest content. In our view, also supported by the Ukrainian civil society statement of 12 April 2022, content depicting violence or nudity may acquire public-interest value during international armed conflicts and shall thus be preserved on the platforms. For instance, it may bear potential evidentiary value for tribunals dealing with international crimes. We therefore call on Meta to design a carve-out from its rules aimed at the preservation of such content.
As to the second question, DSLU argues that artistic speech deserves additional protection under international law. This principle must be preserved, while being carefully balanced against hate speech prohibitions. In the context of the Russian-Ukrainian war, since 2014 Meta has tended to treat this content overcautiously, deleting caricatures depicting Putin and other Russian politicians under its hate speech rules and banning users. Guidance on establishing this balance may be drawn from the ECtHR decision in M’Bala M’Bala v France. There, the Court held that satire and the use of artistic expression to criticise reality are permissible insofar as they do not amount to an abuse of conventional values, such as denying crimes against humanity and other international crimes, or promoting Islamophobia and antisemitism. A mere insult to a person, or even a group of persons, in a social media post shall not suffice for a finding of a policy violation and may amount to a disproportionate restriction of users’ speech. We therefore call on Meta to clarify its policies so as to give artistic expression the protection it is granted under international law.
Finally, we suggest Meta create a dedicated content moderation team for each significant international armed conflict. It should consist of people with a proper contextual understanding of the situation and serve as a unified point of contact for local trusted partners. It should also be prepared to respond quickly to incidents, such as bans of popular users and journalists, who frequently fall prey to coordinated attacks.