Without thinking about it, we use our faces to unlock our phones. Facial recognition as a means of gaining entry to events is also increasingly common. It seems convenient, until you see what is happening in China, where cameras equipped with facial recognition constantly monitor citizens. This allows China to pick out not only shoplifters and traffic violators in a crowd, but also, for example, individuals who criticize the regime.
Such a system will probably never materialize in Europe, but where do we draw the line? Theo Breuers thinks we should ban facial recognition in public spaces. Yet with his company People Flows, he is building a digital data vault that is itself based on facial recognition. You can store flight tickets, entrance tickets or medical data in this vault and decide for yourself with whom you share them.
How does it work? “It’s very simple,” laughs Breuers, who has been working on facial recognition technology since 2018. “You download the app, enter your details, scan your passport and take a selfie. These are not stored as a whole; only vectors are kept to confirm that it is you.”
Cannot be traced back to an individual
These vectors, or digital facial features, are linked by a random number to an admission ticket (which you can download in the app) for a festival, for example. At the festival, visitors look at the camera and, if their facial features match the ticket, they are allowed in. Breuers: “It is impossible to trace this back to an individual. Moreover, the user is always in control. They decide for themselves who can see their data.”
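People Flows has not published its matching algorithm, but the vector-and-token scheme Breuers describes can be sketched roughly as follows: a face is reduced to an embedding vector, that vector is linked to a ticket through a random token, and entry is granted when a fresh camera embedding is close enough to the enrolled one. The function names, the tiny three-dimensional vectors and the similarity threshold below are illustrative assumptions, not the company’s actual implementation.

```python
import math
import secrets

def cosine_similarity(a, b):
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def enrol(embedding):
    """Enrolment: store only the vector and a random token, never the selfie.
    The ticket references the token, not the person."""
    token = secrets.token_hex(16)            # random link between vector and ticket
    vault = {token: embedding}               # hypothetical on-device store
    ticket = {"event": "festival", "token": token}
    return vault, ticket

def admit(vault, ticket, live_embedding, threshold=0.9):
    """At the gate: compare the live camera embedding with the enrolled vector."""
    enrolled = vault[ticket["token"]]
    return cosine_similarity(enrolled, live_embedding) >= threshold

vault, ticket = enrol([0.12, 0.80, 0.55])
print(admit(vault, ticket, [0.13, 0.79, 0.56]))  # near-identical vectors -> True
print(admit(vault, ticket, [0.90, 0.10, 0.05]))  # very different vectors -> False
```

Real systems use embeddings with hundreds of dimensions produced by a neural network, but the matching step is essentially this comparison against a threshold.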
You can use the People Flows digital vault in the same way to store other data, such as diplomas, medical records or documents. Inside the vault, you decide which parts of these documents you share and with whom. “You are therefore in control of your own data. Why should we have all kinds of stuff scattered everywhere? With biometric data, like a face, you can store all those sensitive documents securely.”
Currently, People Flows is testing its system at the football club Heracles Almelo and will soon launch a pilot project at two airports; Breuers is not free to say which ones. According to him, football fans have so far responded enthusiastically to the pilot. “But you must carefully explain what you are doing and make it clear that there is virtually no risk of losing your data. They find the speed and ease of being admitted using their face a major advantage. In previous tests at events, around 85% of the festival-goers who were approached participated. We hope to reach that figure at the Heracles Almelo stadium too.”
Should it be banned or not?
According to him, the system also works extremely well to keep people banned from the stadium away from the front door. “People banned from the stadium cannot buy tickets and only people who have tickets can enter the stadium.”
However, he does not support the use of such systems in the public domain, for example to track down (serious) criminals. “The technology is already advanced enough for that; we have shown that we can pick one person out of a 25,000-seat stadium. But we are absolutely against that, and we will never go that way. We want users to have the choice to determine what happens to their face. We shouldn’t want facial recognition in public spaces at all. It should be banned. There are other ways to organize society,” Breuers points out.
Jeroen van Rest has worked for the Netherlands Organisation for Applied Scientific Research (TNO) for twenty years, where he researches facial recognition. According to him, a total ban would not make much sense. The technology “presumably has useful specific applications, also in public spaces. It is, of course, already present in consumer electronics such as telephones. That creates familiarity and is a way of introducing the technology. Moreover, some countries are already using it. We usually point to countries suspected of using it unethically, such as China or Russia. But for us in Europe it is instructive to look at England, Israel, the United States and Australia: countries that are perhaps not so culturally distant from us. The technology is already in use there. We must become masters of the technology and understand why, when and where it can and cannot be used.”
Facial recognition in public spaces
Van Rest sees several use cases where facial recognition in public spaces could well be among the options. “A nuclear power plant is a private organization, and securing it is not a legal task within the meaning of the General Data Protection Regulation (GDPR), even though it is understood that such a plant must be properly protected. Protection, or an acute threat, can constitute a ‘legitimate interest’ in using facial recognition in the terms defined by the GDPR. By conducting experiments on limited and specific use cases, we develop a form of collective self-confidence. What are the advantages? What are the drawbacks? That gives us a more realistic picture of what is and isn’t possible and preferable.”
Van Rest does warn that uncontrolled growth can lead to disturbing incidents. “And I won’t be surprised if those incidents are significant. But developments in the field of facial recognition are moving fast, and for several years they have also aimed at preventing an unfair distribution of errors; for example, preventing ethnic origin, age or gender from leading to disproportionately more false recognitions. A ban would be tantamount to throwing the baby out with the bathwater: any knowledge gained would then be wasted.”
Civil society pressure
“What if the EU says that facial recognition will be banned in public spaces? I can imagine that at some point civil society will push to use facial recognition in public spaces after all.”
However, there must always be a good reason for doing so. Using facial recognition to round up a group of young people causing a nuisance in a supermarket is going too far, he says. “Furthermore, the minister has already stated that we will never move to a generic form of live facial recognition. So camera surveillance alone cannot be a valid basis for using this technology.”
Security and privacy
According to Van Rest, it is important to take maximum account of the privacy of filmed passers-by when developing facial recognition. At the same time, it is worth exploring how the technology could still be useful to security agencies. In an experiment last year, conducted under the auspices of TNO in collaboration with the Johan Cruyff Arena and the Dutch police, Van Rest explored an idea that came from the police: multiparty computation (MPC) as a secure shield around facial recognition. “Municipalities and the police have a shared interest in keeping public spaces safe. But there is another interest at stake: privacy. Why, for example, should all camera images be shared with the police?”
This is where MPC comes in. Van Rest: “Imagine you have two data streams: one from the camera footage and one from the police watchlist. With MPC, you can process these two streams together without having to share the actual data, so you don’t need a ‘trusted’ third party. Each party holds only part of the data, but together they have all the information needed to compute the result they need.”
Calculating an average salary
“Think of it as calculating an average salary among three or more people. Each person splits their own salary into three random numbers that add up to that salary. If they then each exchange two of those three numbers with the others, the total information, including the average salary, is still present in the aggregate of that data, but it is no longer tied to any individual’s salary. The same principle can be applied to comparing faces: you don’t need to share images or watchlists with each other. The processing runs on combined numbers, not on the sensitive data itself, so the parties never have to share that data,” says Van Rest.
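Van Rest’s salary example can be written out directly. The technique he describes is additive secret sharing: each party splits its salary into random shares that sum to the salary and hands the shares out, and only share sums are ever published. The sketch below is a minimal plain-Python illustration of that principle, not TNO’s actual MPC implementation; the salaries are made-up numbers.

```python
import random

def make_shares(salary, n=3):
    """Split a salary into n random integers that sum exactly to the salary."""
    shares = [random.randint(-10**6, 10**6) for _ in range(n - 1)]
    shares.append(salary - sum(shares))  # last share makes the total come out right
    return shares

salaries = [42_000, 55_000, 61_000]      # each party's private input

# Every party produces one share for each party (row i = party i's shares).
share_matrix = [make_shares(s) for s in salaries]

# Party j receives column j of the matrix and publishes only its column sum.
# A single column looks like random noise and reveals nothing about any salary.
column_sums = [sum(row[j] for row in share_matrix) for j in range(3)]

# The published column sums together reveal the total, and thus the average,
# but never any individual salary.
average = sum(column_sums) / 3
print(average)  # (42000 + 55000 + 61000) / 3
```

The same additive trick underlies the face-matching case: the comparison is carried out on shares of the camera data and the watchlist, so neither party has to hand its raw data to the other.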
The Amsterdam experiment is not intended to be put into practice, Van Rest notes. “It is an exploration of what is possible; we want to show managers and decision-makers what is feasible. That allows for a better-informed discussion about security and privacy in public spaces. We want to make sure we keep asking the right questions in case someone does decide to use this in public spaces.”