In times of pervasive digitalization, mass surveillance has become a popular tool for authorities seeking to preserve public order. One of the most frequently used surveillance tools is facial recognition technology (hereinafter – FRT) – a biometric system that uses automated methods to verify or identify a person. Ukraine is one of the states that often resorts to FRT, and its need for biometric technologies became even clearer following Russia’s full-scale invasion. In response to the emergency, the state turned to Clearview AI. Initially developed for law enforcement purposes, Clearview’s technology matches uploaded images against a database of publicly available images scraped from websites, including social media platforms. Notably, despite having no unified legislation on surveillance, Ukraine resorts to digital measures without a legal framework or safeguards for data subjects. Given the prolonged duration of martial law, arbitrary use of FRT by the authorities may harm citizens’ privacy, especially if the technology continues to be applied after the emergency ceases to exist. It is therefore essential to conduct a human rights due diligence assessment and to strike an appropriate balance between Ukrainians’ right to privacy and the state’s national interests when FRT is applied in times of emergency.
1. Who Can Save Human Rights during a State of Emergency?
It is essential to comply with human rights obligations even in exceptional circumstances, including war or other emergencies. In this regard, Article 15 of the ECHR allows a state to derogate from its obligations under the Convention, introducing additional restrictions on protected rights where necessary. Under a derogation, authorities may suspend regular procedures and be granted broader powers. Still, even given this wider discretion, a state’s margin of appreciation cannot be unlimited and must remain within the bounds of necessity and proportionality. The ECHR establishes general requirements for a derogation to be considered valid:
- existence of war or other public emergency threatening the life of the nation;
- absence of restrictions on non-derogable rights (Article 2, except in respect of deaths resulting from lawful acts of war, and Articles 3, 4(1) and 7 of the ECHR);
- derogation is strictly required by the exigencies of the situation;
- notification of the Secretary General of the Council of Europe regarding the derogation.
At the same time, the practice of the European Court of Human Rights (hereinafter – ECtHR) demonstrates that an individual assessment remains necessary in each case of derogation to avoid potential abuse and arbitrary interference with human rights, because it is still unclear how far the authorities’ discretion may extend even where derogation allows the suspension of certain rights and regular procedures. The Court emphasizes that human rights do not cease to exist in times of emergency: the corresponding obligations remain imposed on states. For example, in Lawless v Ireland (No. 3), concerning the introduction of special powers of detention, the ECtHR upheld such powers because they were used for the specific purpose of the emergency and were subject to a number of legal safeguards against abuse (para 38). Similarly, in Brannigan and McBride v the United Kingdom, the Court confirmed that the national authorities had not overstepped their margin of appreciation thanks to existing legal safeguards, such as the right to consult a solicitor and the possibility of challenging detention through judicial review (paras 64, 61).
Turning to the present issue of FRT use, there is a concern that in times of emergency authorities may deploy even more intrusive digital technologies, justifying them by the need to protect national interests. Derogations must be made with particular care in respect of non-absolute rights such as the right to privacy: since the limitation clauses of Article 8(2) of the ECHR already provide for justified interference by the authorities, derogation goes further still and allows the state even wider discretion. Here the CJEU mirrors the ECtHR’s case law on proportionality, emphasizing that derogations concerning personal data “must apply only in so far as is strictly necessary”. Lastly, in the context of war, even if certain limitations are in place, it must be guaranteed that excessive restrictions are lifted as soon as the conditions of emergency no longer apply.
Ukraine and Facial Recognition Technologies
Ukraine takes an active part in the development of modern technologies. Like many other countries, it resorts to surveillance measures, among them FRT, for both civil and military purposes. Its policy relies heavily on digitalization, as seen in the mobile app “Diia”, created as a repository of electronic documents, and the “Safe.City” programme, introduced as an innovative way of ensuring local security. The use of digital technologies in Ukraine has only increased since the full-scale Russian invasion. The following sub-chapters will therefore assess whether Ukrainian legislation grants sufficient safeguards to the data subjects affected by FRT.
2. Legislative and Policy Framework for FRTs in Ukraine
In principle, Ukraine has no unified legal framework governing mass surveillance. Specific provisions on digital technologies are scattered across separate laws and bylaws. Thus, notwithstanding the absence of comprehensive regulation, several laws can be analyzed.
Data Protection laws. Although Ukraine is regarded as a “digital state”, its legislation on privacy and data protection remains surprisingly outdated. The applicable Law of Ukraine “On the Protection of Personal Data” (hereinafter – Law) was adopted in 2010. Since the GDPR entered into force, the Law has been amended many times, yet it still does not address the specificities of the data protection mechanism. Moreover, the current Law cannot keep pace with advancing digital technologies, contrary to the requirement that legal provisions remain up to date. In particular, the Law does not address non-compliance with data processing timeframes, remedy mechanisms, unauthorized data collection, etc. Importantly, no supervisory body has been established to conduct meaningful oversight of newly emerging technologies in the privacy dimension. While several new draft laws (such as № 5628 and № 8153) have been submitted to the Ukrainian Parliament, little progress has been made on their examination and adoption.
In terms of content, the Law’s privacy protections are formulated mostly in general and vague provisions. Overall, the Law outlines an exhaustive list of grounds for data collection and grants data subjects essential rights protecting them from illegal conduct or arbitrary data gathering. It obliges authorities to process data only for specific and legal purposes, granting individuals a range of rights, including access to information and the right to data erasure. Following the GDPR, the Law prohibits the processing of biometric data, since the latter constitutes a special category of data. At the same time, the Law lists exceptions under which such processing is deemed lawful, among them prior consent and data made publicly available by the individual. The latter exception creates leeway for authorities to collect information, including through the Clearview AI technology (which collects publicly available photos from social media): no permission from the data subject is required, since the Law de facto allows data collection from public resources. In terms of supervisory powers, the Law of Ukraine “On Amendments to Some Legislative Acts of Ukraine regarding the Improvement of the Personal Data Protection System” authorised the Ombudsman of Ukraine to monitor compliance with personal data protection legislation. Such powers, however, cannot be regarded as sufficient to enforce compliance when a violation occurs, since the Law lacks a proper sanction mechanism.
With respect to the most recent initiatives, Draft Law № 8153 “On the Protection of Personal Data” aims to harmonize the legislation with EU standards and fill the existing legislative gaps. The Draft Law significantly elaborates existing procedures, enhancing protection mechanisms for biometric data processing and automated decision-making. However, even the proposed version contains no regulation of surveillance measures. In particular, unlike the EU Proposal for an Artificial Intelligence Act, which distinguishes AI systems by risk level, the Draft Law does not distinguish between ordinary surveillance and the more intrusive FRT. Since the Draft Law provides no restrictions, it essentially gives authorities the “green light” to use any kind of FRT without precautions. Unfortunately, the bill is unlikely to be adopted soon, given the Parliament’s half-year silence since its registration.
Law Enforcement laws. The Law of Ukraine “On the National Police” is another law worth mentioning in the context of surveillance. Article 40 of that Law authorises the police to use “photo and video equipment, including equipment that works in automatic mode” as well as “specialized software for analytical processing of photo and video information”. No limitations whatsoever are imposed on law enforcement in this respect, apart from the obligation to use surveillance for strictly defined purposes. In March 2022, the Ukrainian legislature amended the Law, authorising the police to manage registers and databases containing data about suspected criminals, accused persons, absconding defendants, etc. Notably, such databases also hold individuals’ biometric data (including a digitized image of a person’s face), which the police are obligated to collect. Two important points should be raised concerning these amendments.
- Firstly, the amendments provide that the storage period for biometric data and other video-surveillance material is established by the Ministry of Internal Affairs of Ukraine. That ministry is empowered to issue only internal orders, which do not usually have binding force. This, in turn, may open the door to abuse by law enforcement, as biometric data could be stored for an indefinite period of time.
- Secondly, the amendments were introduced during martial law, when wider discretion is allowed. However, the law gives no indication that their application is limited to the period of martial law, so they may well continue to apply in peacetime unless the law is amended once again. There is thus a danger that the police will retain excessive authority even after martial law ceases to apply.
Criminal Procedure laws. The Criminal Procedure Code contains provisions on access to information and communication systems by investigators and prosecutors. They are empowered to collect information from technical devices, such as photo or video recordings made in publicly accessible places (excluding private homes), including recordings made in automatic mode. This provision grants especially wide discretion to law enforcement, for the following reasons:
- The collection of information is carried out on the basis of a resolution of the investigator or prosecutor and does not require a prior court order. Since the same officials both authorize and carry out the collection, and no external supervisory body exists, this not only grants excessive discretion but also raises issues of potential prejudice or bias in decision-making.
- The provision does not specify what kind of information may be obtained, thus giving law enforcement almost unlimited power to collect any kind of personal data.
3. Ukraine Brings Artificial Intelligence to the Battlefield
In terms of practical applications of FRT, in March 2022 the Ukrainian government announced its cooperation with the US company Clearview AI. This system identifies persons by using images previously scraped from online platforms (such as Google, Facebook, Twitter, VKontakte, etc.). In other words, to identify a person, his or her photo is uploaded to the system and the algorithm searches the database for a match. The company’s biometric database holds approximately 10 billion images, and the company sells access mostly to police forces and government agencies. In Ukraine, Clearview AI is used to identify potential Russian soldiers and infiltrators at checkpoints; Russian suspects in war crimes (to gather evidence of international crimes); and dead soldiers, so that their families can be notified (to combat the myth that the Russian invasion is a “special operation”). At present, however, Clearview AI is mostly used to identify dead Russian soldiers (in order to notify their families) as well as the casualties of war, both Russian and Ukrainian.
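The matching step described above can be illustrated in simplified form: a facial recognition system converts each face image into a numerical “embedding” vector and searches its database for the closest vector, reporting a match only if the similarity clears a threshold. The sketch below is purely illustrative, not Clearview’s actual method (the toy three-dimensional vectors, names and threshold are hypothetical; real systems derive embeddings of hundreds of dimensions from the image via a neural network and do not disclose their algorithms):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(probe, gallery, threshold=0.9):
    """Return the gallery identity whose embedding is closest to the probe,
    or None if no candidate clears the similarity threshold."""
    best_id, best_score = None, threshold
    for identity, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id, best_score

# Toy 3-dimensional "embeddings"; real systems use hundreds of dimensions
# computed by a neural network from the face image itself.
gallery = {
    "person_a": [0.9, 0.1, 0.1],
    "person_b": [0.1, 0.9, 0.2],
}
probe = [0.88, 0.12, 0.09]  # embedding of the uploaded photo
match, score = best_match(probe, gallery)
```

The threshold in this sketch captures the trade-off at the heart of the accuracy concerns discussed below: set too low, the system returns false matches (misrecognition); set too high, it fails to identify genuine ones.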
4. “Chilling” Privacy: Threats of the Clearview System
It is critical to note that Clearview AI has received particularly negative treatment in the international arena. Both human rights organizations and academic scholars have found the Clearview system to be an extremely intrusive technology that does not comply with the GDPR. Privacy International, for instance, emphasized that the use of Clearview AI “is a considerable expansion of the realm of surveillance, with very real potential for abuse”. Privacy International, along with other regional organizations (including Digital Human Rights, Homo Digitalis, etc.), filed legal complaints against the Clearview AI company with regulators in France, Austria, Italy, Greece and the United Kingdom. The complainants argued that Clearview AI violated numerous GDPR provisions, namely through the processing of sensitive data, a lack of transparency, and the absence of lawful grounds for data processing. Following a domestic investigation, the French regulator fined Clearview EUR 20 million, ordering it to stop collecting and processing data and to delete the data already gathered. Italy reached an analogous decision, banning the web-scraping technique and obliging Clearview to delete all the data. Such a response from the international community is more than understandable: the use of Clearview AI carries many risks for data subjects.
Firstly, when using Clearview there is always a danger of complete reliance on the system’s algorithm, replacing human decision-making. While the Facial Recognition Vendor Test demonstrated an accuracy rate of 99.85% for the Clearview algorithm, the same accuracy can never be guaranteed for future matches. Automated decision-making remains a mere machine, creating a constant risk of misrecognition. In the context of the war in Ukraine, this entails the danger that the Clearview system may produce fatal errors, such as mistaking civilians for soldiers, heavily wounded combatants for dead, or even Ukrainians for Russian infiltrators. Law enforcement and the military should therefore refrain from relying on Clearview as a sole source of evidence.
Secondly, the Clearview technology raises issues of personal privacy. Privacy is a concept that includes “both a right to control whether one’s information is shared and if so, with whom”. The awareness that one’s photos might be collected and stored may thus have a chilling effect on people’s behaviour. Further, Clearview violates the rules of the GDPR and Ukrainian domestic law, especially regarding special categories of data. To reiterate, the database consists of publicly available pictures from social media. Researchers are certain that Clearview scrapes photos even from private accounts (where the person does not wish to make the information public), presuming that no consent from the person is needed in the first place. Moreover, the database contains even those images “that are no longer, but once were, publicly available”, meaning the technology retains even deleted pictures. Finally, there is no official way to check whether one’s image is in the Clearview AI database, and thus no way to request its removal.
Lastly, FRT creates serious risks when used in the Ukrainian military context. Since the Clearview company decides for itself to whom it offers its services, there is no guarantee that the opposing party to the armed conflict will not obtain the technology at some point. Considering the ongoing fighting and the occupation of certain Ukrainian territories, there is a likelihood that the aggressor state may capture digital tools together with physical infrastructure. Moreover, any private company may use Clearview’s searchable database provided it pays for access. This may negatively affect the Ukrainian information field: since Clearview AI also uses images from the Russian social network VKontakte, Russia may manipulate content on that platform, thus distorting Clearview’s results. Finally, the very fact of the effective use of Clearview’s services during an armed conflict implies a certain degree of legitimization of a dangerous technology, creating the risk of its further use in peacetime or in other conflicts where the balance of roles is less clear (i.e. there is no obvious distinction between an aggressor and a defending party).
All of the abovementioned concerns make the further use of Clearview AI extremely risky. Although the company’s CEO encourages use only by “trained investigators”, this assurance means little where no legal grounds regulate the biometric technology: everything depends purely on the good faith of the company and the relevant state authorities. What is more, the company itself decides who may be granted access to its services, regulating the matter in place of the legislator. This was an issue for European states, and it remains the main problem for Ukraine given the context of war. It is therefore essential to advocate for the proper regulation of surveillance conducted for both military and civil purposes. It is also important to underline that any limitations imposed in the war context must be lifted immediately after the state of emergency ceases to exist.
Despite the abovementioned challenges, it is difficult to deny the effectiveness of the system launched in Ukraine. Ukrainian authorities have stated that Clearview assisted in identifying 125,000 Russian war criminals, among them 50 persons involved in the abduction of children from Ukraine. Given such tangible results, cooperation between Ukraine and Clearview AI is expected to become even closer: the company plans to open a local office in Ukraine to develop digital infrastructure.
5. Recommendations to Ukrainian Legislators
With the development of digital technologies and their deployment on the battlefield, it is essential to advocate for the amendment of Ukrainian legislation. If extraordinary means are deployed without appropriate safeguards, they may well result in grave consequences. It is also vital to pay special attention to special categories of data, since all the technologies Ukraine uses process data through specific technical means allowing the unique identification or authentication of a natural person. The main recommendation of this work is therefore to amend the current legislation, filling the legislative gaps and providing people subjected to surveillance with minimum safeguards. Such safeguards must exist both in wartime and in peacetime, once the emergency ceases to exist. Thus, to ensure the appropriate protection of human rights as well as the lawful deployment of digital tools, this work recommends that Ukrainian legislators do the following:
In peacetime:
Amend the Law of Ukraine “On the Protection of Personal Data”:
- add an article on FRT describing how such systems operate, an exhaustive list of grounds for their use, and rules for subsequent biometric data processing;
- differentiate between ordinary surveillance and AI-driven surveillance, “high-risk” and “low-risk” systems, biometric and non-biometric surveillance;
- provide additional safeguards against the unlawful processing of special categories of data, in particular:
- establish time limits for the storage of biometric data in databases, with a clear list of legitimate purposes (for example, suspicion of a criminal offence, an ongoing criminal investigation, etc.),
- grant data subjects the right to be forgotten,
- develop a mechanism for notifying data subjects once surveillance has been conducted and has achieved its goal;
- establish a sanction mechanism for violations of data protection rules by introducing proportionate financial penalties scaled to minor, serious and grave violations;
- empower the Ombudsman with enforcement functions (for example, the ability to impose disciplinary or administrative sanctions);
- establish an additional supervisory body for better functioning of the data protection system.
Amend the Criminal Code of Ukraine:
- criminalise the unauthorised usage of surveillance measures (especially AI-driven systems).
Amend the Law of Ukraine “On the National Police”:
- add an article defining mass surveillance, the mechanism for its use and an exhaustive list of purposes for which such a measure may be employed.
Amend the Criminal Procedure Code:
- make a court order a prerequisite for law enforcement access to information and communication systems;
- specify what kind of information may be collected from technical devices by investigators and prosecutors.
During wartime:
Amend the Law of Ukraine “On the Protection of Personal Data”:
- provide the possibility of imposing stricter sanctions for non-compliance with data protection rules (for example, by increasing the existing financial penalties), with a clear indication that such a policy applies solely during wartime;
- increase the time periods for storage of biometric data by law enforcement authorities only during wartime;
- expand the list of legitimate purposes for processing of biometric data (for example, national security or territorial integrity) with the sole applicability of the provision during wartime;
- clarify the types and scope of additional measures and technologies allowed to be applied solely and exclusively during wartime.
Amend the Law of Ukraine “On the National Police”:
- specify the intrusive technologies (such as automated decision-making systems and systems controlled by artificial intelligence) to which authorities may resort, with an indication that such digital tools may be deployed in this manner only during wartime;
- provide a mechanism for collaboration between law enforcement and Clearview AI, listing the purposes for which the system is deployed, the functions Clearview must perform, and the limitations on its use, taking into account the privacy of data subjects (for example, preventing Clearview from scraping images from private social media accounts).
Anna Liudva, Tetiana Avdieieva
The analytical report was prepared as part of the program “Promoting Internet Freedom in Ukraine” implemented by the American Bar Association in Ukraine / Rule of Law Initiative.