
Biometrics in humanitarian action: a delicate balance

Technology itself is neither good nor bad; what matters is how we use it. Employing biometrics for identification purposes in humanitarian work is a use of technology that should fuel a profound debate on ethics and better funding for technical research, argues an ICRC expert.

This content was published on October 4, 2021 - 10:00

Needs in humanitarian action often surpass available resources, and there is an obligation to make sure affected populations get what they need: food, medical treatment, clean water. No less and no more. In other words, the identification and authentication of people receiving aid are important.

In locations where humanitarian organizations operate, people must be registered: their name, age, and gender. Then, organizations must find a method by which people can later prove that they are entitled to the service or goods they request. Often, this is done with a card and a PIN code or a password, but these can be forgotten or lost; they can be stolen and used by another person. Biometrics promises both to increase efficiency (and decrease costs) and to improve the inclusiveness of humanitarian registration.

Some traits are more distinctive than others. ‘Soft biometrics’ such as height, age, gender, and eye color can be used to differentiate individuals but not to identify them, since, for example, many people have the same eye color.

‘Hard biometric’ traits like iris scans or vein patterns can identify any specific individual. It is this category, the hard traits, which are collected today in humanitarian biometrics-based registration systems.

UN World Data Forum

Data takes center stage at the UN World Data Forum in Bern (Oct. 3-6). The conference brings together data experts from the public and private sectors as well as civil society to discuss new data solutions that support the implementation of the UN Agenda 2030 for sustainable development. 


Depending on the degree of urgency, and whether it is a conflict zone or a natural disaster, humanitarian organizations may use ‘soft’ or ‘hard’ biometrics to identify the affected population. A thorough risk assessment will play an important role in the selection process. 

Except for some specific forensic uses where we need to obtain, for instance, DNA information, biometric data tends to suffer from ‘function creep’. Its collection cannot be restricted to a verification or identification purpose only: biometric data can always reveal – now or in the future – more about the person than is needed for the original purpose.

For instance, iris scans or vein network imagery are used in hospitals to diagnose patients. If a humanitarian organization uses iris scans or vein network imagery for identification purposes, there is nothing preventing them from using the same data to obtain information on the health of the people they’re identifying – even if this may go well beyond the scope of the collection. This is not only true for health-related secondary use; facial recognition reveals a lot of information about a person (e.g. their ethnicity, age range, etc.). 

Of course, biographic data can also reveal more information than intended by the original purpose; for instance, names can reveal information about ethnicity. This can also happen indirectly, when two or more sets of data are processed together to infer new information. However, biographic data is rarely sufficient to single you out from a group, while your iris is totally unique to you. 

The takeaway here is that using biometric data for verification or identification is very problematic if we want to minimize the risks of adverse usage. This is also one of the reasons biometric data is so sensitive: once it is in the wild, anyone who gets their hands on it steals much more than your entitlement to food or medication; they gain the capacity to learn a great deal about you, and then to trace you. In a humanitarian context this is extremely problematic: it potentially violates the do-no-harm principle, and it potentially jeopardizes impartiality if that data is used to ostracize or discriminate against a population.

The problem to solve is to find a method where the desirable features of biometrics can be obtained without keeping the biometric data available for secondary use. In other words, the data stored in the identification systems should not be in a form that is exploitable to deduce health conditions, ethnicity, or other problematic disclosures. 

The ideal scenario would be to identify people with systems that do not expose the biometric data, so that if data is lost or leaked it is not even recognizable as biometrics but instead looks more like ‘junk data’. While this concept may sound convoluted, it has been used for decades in every password-based authentication system, thanks to hashing functions. Hashing – a function that non-reversibly transforms a word or sentence into a unique sequence of letters and characters – allows for the storage of passwords in a way that does not reveal what the password is but still allows for it to be checked later. 
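The mechanism can be sketched in a few lines of Python (a toy illustration; the function names and parameters are my own, and production systems layer further protections on top):

```python
import hashlib
import os

def hash_password(password: str, salt: bytes) -> bytes:
    # PBKDF2-HMAC-SHA256: a salted, deliberately slow hash, so the stored
    # value does not reveal the password but can still be checked later.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

salt = os.urandom(16)  # a fresh random salt per user
stored = hash_password("correct horse battery staple", salt)

def verify(submitted: str) -> bool:
    # Re-hash the submitted password with the same salt and compare digests;
    # the clear-text password itself is never stored anywhere.
    return hash_password(submitted, salt) == stored

print(verify("correct horse battery staple"))  # True
print(verify("wrong password"))                # False
```

Only the salt and the digest are kept in the database; if that database leaks, the attacker sees what looks like ‘junk data’ rather than the password itself.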

At the risk of oversimplification, we need to adopt the same approach for biometrics, a recommendation that is easier said than done. Passwords must always be exactly the same when we type them; if even a single character differs, they will not be accepted. This is the property that makes password hashing possible: every different password generates a totally different hash, and verification succeeds only when two hashes are exactly identical. In other words, the clear-text password is never used directly; only the hashes are.

And therein lies the challenge: because biometric samples are never exactly the same when collected (due to lighting, position, angle, dust and other factors), hashing cannot be used. Instead, the match is decided via a probability threshold: for instance, if two compared biometric samples are 95% the same, they are considered a match. With hashing techniques, the two hashed biometric samples would be totally different, and a 95% match could no longer be detected.
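A small Python sketch makes the contrast concrete (the bit-string ‘samples’ are invented for illustration):

```python
import hashlib

# Two toy "biometric" samples that differ in a single bit, mimicking
# two scans of the same iris under slightly different conditions.
sample_a = "1011001110001111" * 4
sample_b = sample_a[:-1] + "0"  # last bit flipped (it was "1")

# Hashing: one changed bit yields completely unrelated digests,
# so a "95% similar" comparison between hashes is meaningless.
digest_a = hashlib.sha256(sample_a.encode()).hexdigest()
digest_b = hashlib.sha256(sample_b.encode()).hexdigest()
print(digest_a == digest_b)  # False

# Biometric matching instead scores raw similarity against a threshold.
def similarity(a: str, b: str) -> float:
    return sum(x == y for x, y in zip(a, b)) / len(a)

print(similarity(sample_a, sample_b) >= 0.95)  # True: accepted as a match
```

The two samples agree on 63 of 64 bits, so threshold matching accepts them, while their hashes share nothing recognizable.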

In its Biometrics Policy, the ICRC decided to restrict the use of biometrics to cases where the data is stored on a device that remains in the hands of the user, such as a card. This comes, however, at the cost of some functionality: in particular, since the ICRC does not keep biometrics in a database, one cannot verify whether a person appears twice in a database. This capability, called deduplication, is potentially important to prevent the wrong person from receiving a service, or some people from receiving too much. It is important to note that this policy is essentially driven by the lack of available solutions that would protect biometric data in all use cases.

Research to advance this field exists but lacks resources, and the efforts to standardize biometrics are also very much a work in progress. For instance, when you take a photo for your passport, there are rules and standards you have to follow: no sunglasses, no face cover, a prescribed position of the eyes in the photo, and so on. In the case of biometrics, standardization matters both to allow for system interoperability and prevent vendor lock-in (e.g. to replace an iris or fingerprint scanner with one from another provider) and to improve the quality of development and testing (e.g. to fight biases and enable accountability for performance). The revised ISO standards for the protection of biometric information (from a Geneva-based organization) will be a good step in that direction, and regulations at national and supranational levels, such as those currently being discussed for artificial intelligence, would help too.

Despite the challenges, new research in this domain exists, including on how to securely store biometrics in a centralized database or even on a public cloud. These technologies concentrate on finding new ways of processing biometric data so that it works much like password hashing: it remains usable to properly identify individuals, but it reveals no information about them and makes it impossible to recreate the original biometric sample from the processed data (irreversibility).

Another key related property is revocability (or renewability). Passwords can easily be changed when compromised, but people cannot change their original biometric traits: you cannot change an iris or a fingerprint. One idea is to transform biometric data in such a way that the same finger or iris can be scanned again to create a different transformed template, one that works for identification just as the old one did, but is deliberately unrelated to the old one, which is revoked.
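As a hedged sketch of this idea, the toy Python below derives the stored template from a secret key; the transform and the key handling are invented for illustration, and real revocable-template schemes are considerably more sophisticated:

```python
import random

def transform(template: str, key: int) -> str:
    # Key-seeded permutation of the template bits: with the same key the
    # output is reproducible (so matching still works), while a new key
    # yields a template unrelated to the revoked one.
    rng = random.Random(key)
    positions = list(range(len(template)))
    rng.shuffle(positions)
    return "".join(template[i] for i in positions)

template = "1011001110001111" * 4  # toy 64-bit biometric template

enrolled = transform(template, key=1)
print(enrolled == transform(template, key=1))  # True: same key, same template

# After a breach, re-enrol the same finger or iris under a fresh key:
renewed = transform(template, key=2)
print(enrolled == renewed)  # False: the old template is revoked
```

Because a fixed permutation moves each bit to a fixed position, two noisy scans transformed under the same key stay roughly as similar as the originals, which is what keeps threshold matching possible.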

A different line of research focuses on reducing the uniqueness of the biometric data until it is no longer sensitive and no longer reveals private information. These methods remove parts of the biometric sample (e.g. splitting an image into many small blocks and discarding many of them) or obfuscate the biometric data by distorting it or adding noise to it. These transformations aim to change the data stored in the database so much that it can no longer be linked to the original data.
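A minimal sketch of the block-discarding idea, again on an invented bit-string sample (the function and its parameters are illustrative only):

```python
import random

def obfuscate(bits: str, keep_fraction: float, seed: int) -> str:
    # Split the sample into 8-bit blocks and keep only a seeded random
    # subset; the discarded blocks never reach the database, so the
    # stored fragment cannot be linked back to the full original sample.
    blocks = [bits[i:i + 8] for i in range(0, len(bits), 8)]
    rng = random.Random(seed)
    kept = sorted(rng.sample(range(len(blocks)), int(len(blocks) * keep_fraction)))
    return "".join(blocks[i] for i in kept)

sample = "1011001110001111" * 8  # toy 128-bit sample: 16 blocks
stored = obfuscate(sample, keep_fraction=0.25, seed=42)
print(len(stored))  # 32: only 4 of the 16 blocks are retained
```

The fragment kept in the database still carries enough distinctiveness for threshold matching, while most of the sample is simply never stored.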

To fuel the debate on the use of biometric technology in the humanitarian sector we must go beyond declarations of intent; it is important to have all the information available. This means, for instance, staying mindful that even with great data-protection-by-design processes, biometric data remains over-purposed by nature, since it reveals a lot more than intended. There is therefore a certain urgency to invest more in technical research to protect people from function creep and to design and implement systems where biometric data is not over-purposed by design. With this in mind, we are calling for partnerships with the academic and private sectors to launch a project aimed at developing such privacy-preserving and secure systems. This collaboration is needed now, as biometrics-based identification systems are being rolled out at increasing speed.

The views expressed in this article are solely those of the author, and do not necessarily reflect the views of SWI swissinfo.ch.

