Facial recognition

The good, the bad and the ugly

By Jen Stern - 14 Nov 2021

Whether it’s unlocking your phone or accessing your banking app, facial recognition technology is everywhere. It’s a tool that is rapidly gaining popularity with banks, security companies, law enforcement, retailers and – of course – social media platforms. And, like every tool from a claw hammer to a chainsaw or a steamroller, it can be used constructively or destructively. Or, if not actually destructively, at least creepily.

Facial recognition – the good

Considering how much data and financial information we have on our phones, it’s reassuring that no-one (except perhaps our twin sibling) can unlock our mobile devices. And it’s certainly a great way to effectively monitor access to, say, a residential estate. It’s non-invasive (in the physical sense), and it requires less proximity than either eye pattern or fingerprint recognition.

Facial recognition and data collection

Facial recognition depends on data and – as with many applications – the more data the system has access to, the more accurate it is. And wow – is that data out there! For the purposes of passing a moral judgement on the collection of data, we need to accurately define what data is – and how it differs from information. The data is out there, but it’s pretty much meaningless until it is manipulated.

As an example, say you decide you are going to budget more accurately. So you keep all your supermarket slips and, once home, pop them into a jar on the kitchen counter. That jar full of paper is data. But, as you can guess, it’s pretty meaningless. But then, at the end of the month, you collect all the slips, put them in order, and enter into a spreadsheet the rand value of every single purchase, and what you bought. The resulting spreadsheet is information. It can tell you, for example, that you spent five times as much on chocolate as on vegetables, and could explain why – in column D – you had to buy new jeans. (Discovery does something similar with their healthy food app.)
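For the more technically minded, the same slips-to-spreadsheet step can be sketched in a few lines of code. The categories and rand values below are invented purely for illustration – the point is simply that the raw list is data, and the totals are information.

```python
# A toy sketch of the jar-of-slips example: the raw purchases are 'data',
# the per-category totals are 'information'. All values are invented.

from collections import defaultdict

# The jar: individual till-slip lines as (category, rand value) pairs.
purchases = [
    ("chocolate", 45.99),
    ("vegetables", 12.50),
    ("chocolate", 89.90),
    ("vegetables", 15.00),
    ("chocolate", 52.30),
]

# The spreadsheet: add up what was spent per category.
totals = defaultdict(float)
for category, rand_value in purchases:
    totals[category] += rand_value

for category, total in sorted(totals.items()):
    print(f"{category}: R{total:.2f}")
```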

Facial recognition – the bad

So, having your face out there jumbled up with a whole lot of other faces in a metaphorical marmalade jar is not really that disconcerting. Think of that metaphorical marmalade jar as Facebook, Twitter, LinkedIn, etc., etc. All the social media platforms you use, have used, and will use in the future once they have been invented. Now – come on – you’re not that naïve, are you? What do you think happens with all those millions of faces that are conveniently linked to profiles and/or devices? (Oh yes, did you know that every photo you take with your phone carries embedded metadata identifying the phone that took it? And the time and place, of course, but you knew that. Right?)
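If you’re curious, you can see that embedded metadata for yourself. Here is a minimal sketch in Python, assuming the Pillow imaging library is installed and using a hypothetical photo called holiday.jpg; it prints a few of the EXIF fields a phone camera typically writes (GPS coordinates, where present, sit in a separate block of the same EXIF data).

```python
# A minimal sketch: reading the metadata (EXIF) that a phone embeds in a photo.
# Assumes the Pillow library is installed; 'holiday.jpg' is a hypothetical file.

from PIL import Image, ExifTags

image = Image.open("holiday.jpg")
exif = image.getexif()

# EXIF stores fields under numeric tag IDs; map them to readable names
# and print a few that identify the device and when the photo was taken.
for tag_id, value in exif.items():
    tag_name = ExifTags.TAGS.get(tag_id, tag_id)
    if tag_name in ("Make", "Model", "DateTime", "Software"):
        print(f"{tag_name}: {value}")
```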

Now, technically, all that data (not information, data) is in the public domain, which means anyone can access it. It also means that there is no specific law against anyone harvesting it. At least not yet. And that’s exactly what US-based tech company Clearview has done – ‘harvested’ literally billions of images from various social media platforms, and matched them against one another in order to put a name to each face. That is information, and when all that information is collated in a way that can be readily accessed and used, it’s a database.

Now here’s where it gets creepy. What do you think Clearview did with all that information? Sold it on, of course, but not, as you would expect, to the highest bidder. Their marketing strategy was to – initially – offer their database at a very low cost to any law enforcement agency. So, unless you are quite unusual, your face is in Clearview’s database, along with your name, and sufficient links to immediately identify where you went to school, who you were ‘in a relationship’ with for five months in 2012, and where you shop.

Facial recognition – interesting

Whatever your feelings are about the morality of private companies like Clearview collecting information about you and your children and selling it on, it is undeniable that a database like this can be useful. One example Clearview used in its marketing is the story of the identification, trial, conviction and imprisonment of Andres Rafael Viola. It all started when Yahoo identified some nasty photos that seemed to be documenting a case of sexual abuse of a child – one of which showed a man staring straight at the camera. Long story short – the photo ended up with the Department of Homeland Security in the USA, where an investigator ran it through Clearview.

The system came up with one hit – a posed shot of two bodybuilders (neither of whom looked remotely like the man in the photo) at a bodybuilding expo. But – behind them, almost out of shot, was a man at a booth selling supplements. That was the match, and his face was a minuscule part of the whole image. They traced him through the supplements company, found more images of the same abused girl on his computer, arrested him, tried him, convicted him and put him in jail, where he belongs, for 35 years. (Clearview have since stopped making their databases available to private companies, but they continue to sell to governments.)

Facial recognition – the ugly

Back in the previous century in South Africa, press photographers were constantly faced with the dilemma of whether – and how – to use photographs of political events. Obviously, they wanted to publish photos that showed readers what was going on, but there was also the risk of the police identifying activists. It was easier in those days, of course. The photographer and/or film developer had control of the images, and of how they were used. But that is not the case now, and it’s ironic that the most public instance of photos being used in this way is the identification of pro-Trump right-wing protestors at the US Capitol.

Closer to home, the Department of Home Affairs (DHA) stated specifically in its 2019/2020 Annual Performance Plan report that it fully intended to embrace facial recognition technology as part of a consolidated biometric approach to identification and verification – as well as to facilitate the rather sinisterly named enhanced movement control system (EMCS):

  • ‘The Automated Biometric Identification System (ABIS), which will enable advance identification and verification through fingerprints and other selected modes of biometrics (palm-prints, iris, facial recognition and DNA), was launched on 16 May 2018. The ABIS will form the backbone of the future national identity system, which will replace the current national population register, using real-time data from the civic registers, the enhanced movement control system (EMCS) and the national immigration identification system (NIIS). The delivery of this project will happen in a phased approach.
  • The State Information Technology Agency (SITA) and the Council for Scientific and Industrial Research (CSIR) have completed comprehensive system conceptual design and specifications. Procurement of a service provider through SITA was completed in the 2017/18 financial year. All hardware to urgently tech refresh the environment has been procured. The development of the new system and data migration were planned for the 2018/19 financial year. This system will enable effective e-Government initiatives, with all departments and government entities that require instant identification and verification during service delivery having central access to ABIS.’

This does sound frighteningly like the good-old-bad-old Pass System we all thought had been done away with, but now you carry your dompas on the front of your head. Or am I just being paranoid?

Facial recognition and the POPI Act

While the POPI Act – specifically Section 26 – prohibits the processing of ‘personal information concerning the religious or philosophical beliefs, race or ethnic origin, trade union membership, political persuasion, health or sex life or biometric information (my emphasis) of a data subject,’ there are loopholes. Section 27 (1) states that the prohibition on processing personal information does not apply if, inter alia:

(a) ‘the processing is carried out with the consent of a data subject referred to in section 26

(e) the information has deliberately been made public by the data subject.’

So be very aware that any photos you have posted on social media have been made public, and be very careful before gaily accepting ‘terms and conditions,’ whether online or in the real world, because you may well be giving consent for your data to be used in any number of ways.

Facial recognition and estate security

There are a host of good reasons to include aspects of facial recognition in an estate’s suite of security products because – like any security tool – facial recognition can be constructively used to enhance the safety of the residents, guests and employees of an estate. But – also like almost any tool – it can be used destructively as well. What this really means is that before including any new aspect of security, you need to be 100% sure of how any data collected is going to be used. Oh, that’s so boring. We’ve said it so many times, and it is really common sense, but it bears repeating. Really, it does.

So here goes: Before you sign up for anything, be sure you know how your data will be used, stored and disposed of.
