Refugee protection and digital legal aid in the Middle East
This two-part online workshop considered how humanitarian technology use can interact with refugee protection, specifically in the Middle East.
In February and March 2021, I organised a two-part workshop in which academics, activists, lawyers and NGO workers were invited to (re)think how digital technologies interact with refugee protection, specifically in the Middle East.
Refugee protection – the right to be protected from persecution and the right to claim these rights in another country – is increasingly data-driven. The growing pluralisation and privatisation of migration management interact with the widespread experimental deployment of humanitarian technology. In border and migration governance, governments and UN agencies are developing emerging digital technologies in ways that are largely experimental and subject to little regulation or oversight.
Discussions of the digital rights of refugees are key, because getting their privacy wrong can have disastrous consequences. Digital technologies also interact with refugee law, for instance by reconstituting what counts as legal knowledge. And the same technologies – biometrics and automated systems – are increasingly used for pre-emptive border controls, further narrowing the right to seek refuge and the future rights of refugees. Here I consider some important concerns and potential directions for doing things differently, derived from the workshop, before making a case for digital refugee lawyering.
Limited regulation, combined with dwindling funding and the push by external stakeholders for efficiency and ‘objectivity’, has contributed to experimental technology use, such as iris-scanning technologies, automated vulnerability assessments and cash assistance via blockchain technology. Humanitarian operations in Jordan and Lebanon are known for the innovation and datafication of relief. Geographical areas that receive less humanitarian and academic attention are perhaps also prime locations for technological experimentation.
Recently, there has been more attention to data protection in humanitarian settings, and international organisations have developed their own data protection policies. But problems persist: limited information is provided to data subjects, (meta)data are widely shared, and data remain permanent, as does the presumption that a digital identity results in a legal identity. Concerns about the use of data beyond its original purposes, cyber (in)security, and the tendency of algorithms to entrench structural inequalities also remain.
The emphasis on ‘new’ technologies can obscure the fact that technologies have long been used in refugee management, and have often simultaneously imposed control. For instance, physical copies of UNHCR’s Refugee Status Determination handbook were never made accessible, out of concern that refugees would use them to ‘game’ the system. The current emphasis on data extraction and biometrics closely resembles colonial governance and its racialised exceptionalism. And some refugee communities have long histories of being experimented on.
What is new is the persistence of data, their accessibility over distance and the ability to continuously reassemble them. Technologies can turn urban refugee settings into camp-like environments by installing modes of surveillance and control. Digital transformations are not confined to refugee governance, but experimentation in humanitarian settings often provides normative and scientific affirmation for technology-driven measures and relates to larger macro-political developments, including anti-migration tendencies and bio-tracing efforts to control Covid-19.
The involvement of the private sector and big tech often creates opacity. Across the board, there is a need for greater techno-legal consciousness and more knowledge of the back end of technological infrastructure: how data can be (mis)used, exploited and misappropriated, and how the activities of private partners – including but not limited to Palantir, IrisGuard and Accenture – oscillate between border control and humanitarian operations. Such private partnerships raise questions about the normative frameworks used within UN organisations. Committed humanitarian operations might be dedicated to not sharing data, but it is questionable whether the third parties involved will uphold the same standards.
This is not an argument for more handbooks: there is often a gap between guidelines produced in Brussels or Geneva and the actual data practices of humanitarian workers, and more guidelines can easily add to work pressure in the ‘field’. Persistent hierarchical work cultures, fear that admitting mistakes will result in the loss of jobs or funding, and the need to tell success stories continue to make learning from the past difficult.
Academics, activists, affected populations, the tech community, practitioners and policymakers ought to join their efforts. This includes being mindful of the politics of translation, language and access to knowledge. The populations concerned are actively involved in negotiating safety, including around the use of their data, but meaningful consent and access to the necessary information are often lacking. From the outset, people on the move, trusted local researchers and communities already working on these topics ought to be involved in discussions in digital rights spaces. In the tech community, the emphasis is often on removing biases, whereas in refugee law personal information and characteristics are crucial to determining the credibility of a claim. These and other differences need to be recognised and addressed.
Implementing partners headquartered in the EEA have been required to follow the General Data Protection Regulation (GDPR) since 2018. The GDPR also applies to personal data collected from people beyond Europe, but it does not apply to international organisations. Workshop participants noted that the GDPR has not resulted in substantive changes in how data are collected, stored and processed: other NGOs, not bound by the GDPR, would simply be asked to do the work. Many countries across the globe have their own national frameworks for data protection, but these are not always enforced in the way the GDPR is. Data protection policies can also be (mis)used for government control.
Donors tend to push for efficiency and a logic of audit, but have rather minimal requirements for data protection in technology-oriented programmes. And claims about the functionality of technologies in humanitarian relief are hardly ever questioned or evaluated. It is therefore noteworthy that in April 2021 a Member of the European Parliament asked why the EU, by funding WFP and UNHCR biometric identity systems for refugee registration in Jordan, was approving standards that would be deemed ethically unacceptable within the EU. This question will hopefully be taken forward.
Discussions on rights easily turn into discursive dances around responsibilities and sovereignty that bear little relation to realities on the ground. The concept of digital refugee lawyering I put forward therefore approaches digital rights as a negotiated practice. It not only considers how technologies interact with the already precarious access to rights that is a reality for many forced and illegalised migrants worldwide. It also explores how to ensure that – given legal marginalisation and the (lack of) rule of law – people seeking protection and those working to aid their access to rights can draw safely upon the potential of digital connectivity. How technologies operate and interact with social relationships is a matter of access, power and privilege. Measures taken to curtail Covid-19 can aggravate the risks refugees face, and much of UNHCR’s processing is now done remotely, with legal aid following this development. This only makes discussions on how to act collectively and locally in favour of the digital rights of refugees and other (illegalised) migrants more pertinent.