Whether in Ukraine or in other crisis zones around the globe, social media platforms have a duty to ensure that people have access to the free flow of life-saving information, according to a statement issued today by 31 international human rights and civil liberties organizations, including the Digital Security Lab Ukraine.
“As a global community of civil society actors, we do not demand a one-size-fits-all approach to responding to human rights crises,” the groups said in the statement. “What we are asking platforms to do is to invest more time and effort in improving their operations now, not when unfolding violence gets into the media spotlight and it is often already too late to act.”
It has become increasingly clear that platforms have followed the same playbook in Ukraine as they have elsewhere: surface-level or extractive relationships with civil society; insufficient support for local language and lack of understanding of context; and responsiveness to media pressure, not civil society pressure or human rights concerns. The Russian invasion of 2022 was a re-escalation of events that began in 2014, and platforms should have been better prepared.
The statement issued Wednesday calls upon platforms to be better prepared going forward, and urges them to address structural inequalities in how they treat different countries, markets, and regions. Specifically, the statement calls upon platforms to provide:
- Real human rights due diligence: Platforms should engage in ongoing and meaningful human rights due diligence globally, prioritizing for immediate review their operations in those countries and regions whose inhabitants are at risk of mass killings or grave human rights violations.
- Equitable investment: Platform investments in policy, safety, and integrity must be determined by the level of risk to human rights in a given country or region, not just by its commercial value or by whether the company operates in a jurisdiction with enforceable regulatory powers.
- Meaningful engagement: Platforms must build meaningful relationships with civil society globally that are based not merely on extracting information to improve products, but that also give civil society meaningful opportunities to shape platform tools and policies.
- Linguistic equity in content moderation: Platforms must hire adequate numbers of content moderators and staff for every language in which they provide services. They must fully translate all of their policies into all the languages in which they operate.
- Increased transparency: Platforms should increase transparency and accountability in their content moderation practices. The Santa Clara Principles, which were updated and elaborated in 2021, provide concrete guidance for doing so.
- Clarity about so-called “Terrorist and Violent Extremist Content” (TVEC): Platforms should be fully transparent regarding any content guidelines or rules related to the classification and moderation of “terrorism” and “extremism,” including how they define TVEC, exceptions to those rules, and how the company determines when to make such exceptions. Platforms should push back against attempts by governments to use the TVEC label to silence dissent and independent reporting, and should be clear about how their TVEC policies relate to other policies such as incitement to violence.
- Multi-stakeholder debriefs: When platforms take extraordinary actions or are forced to engage in a “surge response” to emergencies, they must take stock afterwards to evaluate and share what they’ve learned.
“With this statement, we wanted to express solidarity with Ukrainian civil society while pushing social media platforms to do better around the world,” said Dia Kayyali, Associate Director of Advocacy for Mnemonic. “Ukraine, Yemen, India, Sri Lanka, Myanmar, Syria, Sudan—the list of places where platforms need to learn from their failures and be prepared to invest in human rights going forward is far too long. After many years of pressure from global civil society, including dozens of open statements from impacted communities, there is no longer any excuse not to be prepared. We look forward to working with platforms on implementing our demands.”
“We stand with Ukrainians and with all people in crisis zones who rely upon the free flow of information to survive,” said Jillian C. York, EFF’s Director for International Freedom of Expression. “Social media platforms must recognize that all too often their services are misused to both spread misinformation and block from view desperately needed factual information, including evidence of war crimes and other gross human rights violations. These companies must take real steps to ensure that their policies are applied even-handedly and transparently and that their efforts continue after the immediate media spotlight moves on.”
Maksym Dvorovyi, Legal Counsel for Digital Security Lab Ukraine, said an inconsistent approach to content moderation has been a problem since Russia invaded and annexed Crimea in 2014.
“Over the years, Ukrainian users suffered from coordinated reporting of social media posts by Russians and a lack of the social media platforms’ desire to combat this problem,” he said. “Amid a non-transparent approach of assigning moderators to deal with a certain type of reported comments, misperceptions emerged and spread among the Ukrainian society about the review of Ukrainian content by Russian-speaking moderators (lacking knowledge of Ukrainian language and context), or by intermediaries’ ‘Moscow offices’ (often non-existent). Thus, at least in the mind of the Ukrainian users, platforms were biased when dealing with Ukrainian cases under the advice of their predominantly Russian staff.”
Read the full statement here: https://www.eff.org/document/letter-social-media-platforms-crisis-zones