This tweet generated over 200 replies … with a real split between those who will NEVER fly again and those who are either resigned to the intrinsic use of AFR technologies for travel or think the security afforded by the technologies is a good thing.
AFR technologies have been used in Canadian airports since 2017. Our EU biometric passports utilise AFR technology. These e-passports have a chip in them with the holder’s facial biometric. And, as I said in a previous post, research into AFR began in the 1960s with the work of Woody Bledsoe, Helen Chan Wolf, and Charles Bisson.
Despite the seemingly unstoppable rollout of AFR for border security, what can be questioned is its use in public life. Should we have AFR in the high street or in hospitals? Should it be used at football games or at music concerts? It is already happening in some of these places. That is why Ed Bridges, represented by Liberty, is taking South Wales Police to court for their use of AFR in public spaces. ‘South Wales Police has used facial recognition in public spaces on at least 22 occasions since May 2017. Ed believes his face was scanned by South Wales Police at both a peaceful anti-arms protest and while doing his Christmas shopping.’ (Liberty website accessed 25-04-19).
Ed Bridges says:
“Without warning the police have used this invasive technology on peaceful protesters and thousands of people going about their daily business, providing no explanation of how it works and no opportunity for us to consent. The police’s indiscriminate use of facial recognition technology on our streets makes our privacy rights worthless and will force us all to alter our behaviour – it needs to be challenged and it needs to stop.”
The key here is that the indiscriminate use of AFR, without regulation or limits, will force us all to alter our behaviour. It thwarts our right to autonomous action and therefore, limits our individual and collective potential to envision and create a better future.
Postscript: That ‘final’ sentence clearly indicates my inherent bias, and so in the interests of balance here is a link to digital security company Gemalto, with an article published this month on the current trends in AFR (it’s a good read and worth going to) https://www.gemalto.com/govt/biometrics/facial-recognition.
BTW: many thanks to Dr Ian Cook who forwarded me the tweet. It provided many hours of onward links.
I spent Tuesday morning making my face into its own dot-to-dot drawing.
Using this diagram as a guide (taken from the Wonderworks Museum information panel), I drew dots on my face that align with its form and structure. Places such as: REyebrowEnd, REyebrowMid, NoseBridge, LOrbitalUpper, LEar, LOrbitalLower, RJawEnd, RMidForehead… and so on. And, then I drew lines between the dots.
Despite feeling like I was getting ready for an off-the-wall Halloween party, this was a useful exercise.
The measuring and scanning and recording of our faces is an intimate activity. By spending about 30 minutes first drawing the dots and then joining them together, I spent more time looking at my own face than I have in the past 10 years. (It is a little embarrassing. But undoubtedly funny too.)
This dot-to-dot exercise emphasised to me what it means to have your face captured and scrutinised, and brought my head-space rational thinking into a bodily-felt emotion-inducing space.
My initial research into automatic facial recognition (AFR) has thrown up a long list of links. This blog is going to be really useful in helping me to select the ones I have the most to learn from.
There is a mass of information online partly because the research in AFR began in the 1960s, and partly because governments appear very enthusiastic to support its development. The main aim is to make AFR more efficient and effective – to bring its success rate up to 100%.
I had no idea there were so many different forms of AFR. Basically the software needs to measure the face and then compare these measurements to a database of faces. There are many ways to capture a face, such as: plotting points and making measurements, using infrared light, using 3D scanning, analysing skin texture.
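To make the ‘measure then compare’ idea concrete, here is a minimal sketch in Python. It is not how any real AFR system works – the measurement vectors and names below are invented for illustration – but it shows the basic logic: reduce each face to a handful of numbers (say, distances between landmark points), then match a new face against a database by finding the smallest overall difference.

```python
import math

def distance(a, b):
    """Euclidean distance between two face-measurement vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def best_match(probe, database):
    """Return the (name, distance) pair in the database closest to the probe face."""
    return min(
        ((name, distance(probe, measurements))
         for name, measurements in database.items()),
        key=lambda pair: pair[1],
    )

# Invented measurement vectors (e.g. eye spacing, face length, nose width, in mm).
database = {
    "face_A": [62.0, 118.0, 41.5],
    "face_B": [58.5, 121.0, 39.0],
}
probe = [62.3, 117.6, 41.8]  # a newly captured face to identify

name, score = best_match(probe, database)
print(name, round(score, 2))  # → face_A 0.58
```

Real systems replace these three toy numbers with hundreds of learned features, and the comparison step is where the database problems discussed below come in: the system can only ever match against the faces it has been given.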
The databases themselves throw up a multitude of problems. The early databases consist almost entirely of white males (reflecting, I guess, the associates of the computer science researchers), with later databases featuring women and people who aren’t white Caucasian. See this article in the Guardian.
https://www.theguardian.com/technology/2017/dec/04/racist-facial-recognition-white-coders-black-people-police
This link below lists databases from universities that are available for use. http://www.face-rec.org/databases/ – a fascinating insight into the mechanics of AFR research – I also find myself questioning the ethics of how these databases are compiled in the first instance. One dataset consists of women (and presumably young women and girls) who are there doing YouTube make-up tutorials. Once online, they have no say over where their image goes or how it is used.