Contingency planning in the Digital Age: Biometric data of Afghans must be reconsidered
This blog examines the security implications for Afghans who have had their biometrics registered by humanitarian or military agencies.
Humanitarian accountability – including in the domain of tech experimentation – remains widely contested. Earlier research points to various risks, harms and unintended consequences of humanitarian technology or, as it is euphemistically portrayed, humanitarian innovation. Cases become particularly problematic where vulnerable human beings are reduced to digital bodies, where economic value is generated from refugee data, or where technology produces or strengthens a social-political order reminiscent of colonial power structures. This blog tackles a related issue that has received much less attention: the use and impact of tech solutions deployed by NGOs for screening in the context of counterterrorism (see also earlier contributions from Ben Hayes and Emanuela-Chiara Gillard).
Over the past two decades, since the 9/11 attacks in 2001, the relationship between aid work and (counter)terrorism has become increasingly complex, engendering a broad securitisation of the NGO sector. NGOs in the Global South can be targeted by terrorists, used for terrorism-related purposes, and required to participate in counter-terrorism activities in both digital and analogue spaces. According to the Financial Action Task Force, an intergovernmental body, some NGOs in the sector “continue to be misused and exploited by terrorists through a variety of means”, for example transferring money for illegal purposes (see FATF Recommendation 8). As a consequence, screening of certain individuals, potential suppliers, contractors, employees, private donors and, in some settings, even beneficiaries has emerged as an important activity for NGOs: it minimises the risk of contributing to the financing of terrorism on the one hand, and ensures legal compliance and transparent operation on the other. At the same time, these screening practices create ambiguity by potentially transforming larger NGOs into surveillance actors and creating ethical dilemmas for actors within the sector that operate in conflict zones, manage numerous contracts and receive funding from official donors.
By signing the donor contract, an NGO becomes responsible not only for the aid project it implements but also for complying with further legal conditions in the context of counterterrorism (see the Norwegian Refugee Council’s report Principles under Pressure: The impact of counterterrorism measures and preventing/countering violent extremism on principled humanitarian action). Conditional clauses in grant agreements, for example, aim to ensure risk minimisation by preventing money laundering and terrorism financing in the context of aid work. What prevention means, however, is not necessarily specified in grant agreements; rather, it is left to the market, which offers various tech solutions to stakeholders. In this blog, I map out the politics of screening before identifying a set of emergent problems.
Screening means testing or examining something or somebody, usually to detect a disease or fault. While this practice has been widely used and discussed in medical contexts for its obvious ethical implications (what is the point of screening if there is no treatment? does screening serve public health concerns or individual patients’ rights?), various screening techniques are also applied in military/security contexts and in the business sector. Screening can therefore be considered an investigation, “which involves obtaining all relevant available data about a person’s past education, employment, and personal behaviour and making judgments concerning the individual’s likely future loyalty and honesty.” Common techniques include the use of data banks, polygraphs or lie detectors, pencil-and-paper psychological tests, and stress interviews. With regard to counterterrorism, commercial databases offering access to integrated and consolidated sanctions lists can be added to this toolkit.
Screening, if done manually, requires a lot of work. The process has therefore been automated by commercial actors that saw a business opportunity in legal-regulatory compliance. While such tech solutions were originally developed for banks and financial institutions, larger, international NGOs reportedly acknowledge subscribing to screening solutions (NRC 2018, 24; VOICE 2021, 13). Results of the online survey conducted for my research also reveal that 60% of the responding NGOs collect personal data for the purpose of background checks. Answering the question “[w]hat are the main purposes of collecting and processing personal data of local (non-European) data subjects at your organisation (as data controller) in the context of project implementation?”, 60% of the respondents (21 of 35 NGOs) selected the option “contractual obligations required by donors (background check, vetting or screening of individuals expected by our donors)”.
Popular tech solutions available on the market include FinScan, LexisNexis WorldCompliance, CSI WatchDOG Elite, Bridger Insight Online, and Visual Compliance System (VOICE 2021, 13). These systems integrate several hundred sanctions and enforcement lists that are otherwise publicly available on official websites. The main benefits of such tech solutions are simpler legal compliance for preventing financial crime, compliance with anti-bribery requirements, and demonstrable accountability and efficiency. Reviewing hundreds of official lists is time-consuming, especially for larger NGOs. By integrating sanctions lists available on various official websites, the commercial solutions provide customised access to comprehensive financial crime and sanctions data, law enforcement lists and even profiles of politically exposed persons.
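To illustrate the underlying mechanics, the following is a minimal, hypothetical sketch of how such a tool might flag a name against consolidated list entries. The entries, names, helper function and similarity threshold below are invented for illustration only – they are not real sanctions data or any vendor’s actual method, and commercial products use far more sophisticated matching (aliases, transliteration, dates of birth):

```python
from difflib import SequenceMatcher

# Invented placeholder entries standing in for consolidated sanctions lists.
SANCTIONS_ENTRIES = [
    {"name": "Example Person One", "source": "List A"},
    {"name": "Example Entity Two", "source": "List B"},
]

def screen_name(query, entries=SANCTIONS_ENTRIES, threshold=0.85):
    """Return entries whose name resembles the query above a similarity threshold."""
    hits = []
    for entry in entries:
        score = SequenceMatcher(None, query.lower(), entry["name"].lower()).ratio()
        if score >= threshold:
            hits.append({**entry, "score": round(score, 2)})
    return hits

# A near-exact spelling variant still produces a "positive hit" (an alert);
# an unrelated name produces none.
print(screen_name("Example Person 0ne"))
print(screen_name("Unrelated Name"))
```

The sketch also shows why data protection questions arise: even this toy version cannot run without the data subject’s personal data as input, and a positive hit returns a record about the person that the person never sees.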
The operation of Western and international NGOs in the Global South often attracts criticism with regard to its conduct and consequences. Screening is no exception. For example, the six Palestinian NGOs that were designated as ‘terrorist organisations’ by Israel appear in these systems as an ‘alert’, as the systems are regularly updated to follow changes in the relevant datasets, among others via the Israeli government’s website. Such hits require international NGOs to make funding decisions depending on their contracts and negotiations with governmental donors.
The use of screening technologies is also problematic from a human rights perspective. First, the legal-regulatory environment is much weaker in most aid-recipient countries than in the countries where the technologies purchased by aid actors were designed and developed. Second, while the use of technology usually entails the mass collection of personal data, local data protection regimes are typically not only weak but also built on the epistemological foundations of Western privacy notions (see Arora 2019).
What is wrong with screening, if the purpose of these tech solutions is to prevent terrorism in a manner that is efficient from an organisational perspective? Beyond the politics of listing, the dilemma of screening lies partly in secrecy and concealment: the activity, which could not be performed without the personal data of the individuals concerned, remains unknown to the data subjects themselves. While personal data is collected for various legitimate reasons – signing a supplier contract, renting a location, a labour contract or accepting donations – the data subjects are rarely aware that their personal data may also be checked against a database containing sanctions and law enforcement lists.
Recalling the ethical principles of medical screening developed by the WHO (1968), the rule of thumb requires that “there should be sufficient direct evidence from well-conducted studies that early detection improves … outcomes, and that the benefits of screening outweigh any potential harms.” Furthermore, the condition should be an important [health] problem. Terrorism is indeed an important problem, but the politics of listing is not free from controversy. As for the principle that ‘there should be an accepted treatment for patients with recognised disease’, one may consider terrorism a societal disease, but it is unclear what NGOs do with positive hits beyond exclusion from aid projects. Last but not least, screening should be acceptable to the population concerned. Meeting this principle is highly unlikely considering that not only the data subjects screened, but even country officers and advisors working at NGOs, are unfamiliar with this practice (which is usually conducted by the legal or HR department).
Acknowledging that the essence of the screening process is a background check based on personal data (while subscribers to screening tech need basic personal information to run any search in the database, the search yields a file containing further personal data in the case of positive hits), data protection laws should also be considered. With regard to the EU – and NGOs registered in the EU – the transparency principle takes the form of a duty to inform data subjects, which is enshrined in various articles of the GDPR (Articles 12–14; Recitals 11, 58, 59, 60, 63, 166). The principles of fair and transparent processing, coupled with the right of access to information, “require that the data subject be informed of the existence of the processing operation and its purpose … taking into account the specific circumstances and context in which the personal data are processed” (EU GDPR Recital 60). Yet publicly available privacy notices rarely contain any reference to screening. Among the NGOs whose privacy notices were analysed for their content, only one of PLAN UK’s privacy notices can be mentioned as a rare exception:
“Ethical screening and minimising risk: as a registered charity, we are subject to a number of legal and regulatory obligations and standards … this means that we may carry out background checks and appropriate due diligence on donors and potential donors or check donations to help protect the charity from abuse, fraud and/or money laundering and/or terrorist financing.”
However, even PLAN’s general privacy notice fails to mention that not only individual donors but also potential local suppliers, partners, employees or beneficiaries can be screened.
To sum up, to avoid being deemed unaccountable or being accused of financing terrorism, even unintentionally, larger NGOs resort to tech solutions to comply with legal requirements – especially those NGOs whose activities span multiple countries and sectors in the Global South, various donors (EU, US, national), and numerous projects, contracts and financial transactions. The use of such tech solutions, however, requires the collection of personal data and is not unrelated to the right to and notion of privacy, the protection of which is considered a human rights issue in the information age. Screening is an ambiguous practice not only because the politics of listing is controversial, but also because NGOs feel uneasy about participating in activities (anti-money laundering or countering the financing of terrorism) whose purpose looks legitimate from given perspectives but where universal definitions are missing (who is a terrorist?). Participation in counter-terrorism is not without dilemmas: it may conflict with principled humanitarian action, or with human rights in general – or both.