Content governance in times of crisis: DSLU lawyers contributed to drafting the Declaration

In situations of armed conflict and other crises, people use social media and messaging platforms to document human rights abuses or war crimes, access information, mobilize for action, and crowdsource humanitarian assistance. But governments and other actors leverage these same platforms to spread disinformation and hate speech, incite violence, and attack or surveil activists, journalists, and dissidents. In light of the increasingly important role social media companies play during crises, DSLU, under the leadership of Access Now and together with other partner organizations, has co-authored a Declaration of principles for content and platform governance in times of crisis.

This Declaration, jointly developed by Access Now, ARTICLE 19, Mnemonic, the Center for Democracy and Technology, JustPeace Labs, Digital Security Lab Ukraine, Centre for Democracy and Rule of Law (CEDEM), and the Myanmar Internet Project, sets out guidelines to help platforms protect human rights before, during, and after a crisis.

Social media companies have a responsibility to prevent and mitigate human rights harms stemming from the use of their systems. Historically, however, they have responded inadequately and inconsistently, as demonstrated by their failed responses to conflict situations in Ethiopia, Syria, Israel/Palestine, and Myanmar. These failures have disproportionately impacted marginalized communities and facilitated serious human rights abuses.

The Declaration is an effort to advance consistent and rights-respecting principles for companies to respond appropriately to crises and meet their obligations and responsibilities under international human rights law.

DSLU Head of Digital Rights Maksym Dvorovyi views this Declaration as an essential step forward in contextualizing online platforms’ approach to content moderation. “Since the Russian-Ukrainian war’s re-escalation this February, we have witnessed several user bans and content removals based on misidentified context and a lack of linguistic understanding of how certain words are used. We believe online platforms should invest more in moderation in non-English languages to properly comply with their human rights duties under international human rights law and the UN Guiding Principles on Business and Human Rights”.

Meanwhile, DSLU Legal Counsel Tetiana Avdieieva notes: “The surge in social media activity following Russia’s full-scale invasion made it necessary for platforms to structure their content moderation efforts, moving away from chaotic reactions to emerging threats and toward proactive identification of dangers and the development of thoughtful responses to them”.

She also stressed that this Declaration could serve as valuable guidance for big tech companies in creating a human-rights-centered environment worldwide, with policies applied contextually but without discrimination.

The full text of the Declaration can be accessed here.