
Facial recognition doesn't work – but that won't stop it from coming after you

There have been enough warnings about the technology and its effectiveness that the government should be holding back on any rollout

Kuba Shand-Baptiste
Wednesday 12 February 2020 17:55 GMT
Related video: Kit Malthouse says facial recognition will make the search for suspected criminals 'quicker and more effective' (David McNew/AFP/Getty)

If you had the pleasure of strolling through Stratford in London on Tuesday, it may well have caught you.

Despite numerous warnings against its use, the Metropolitan Police Service has gone ahead with its live facial recognition (LFR) programme, rolling it out in front of the Stratford Centre shopping complex. Thousands of people will have passed by, their faces captured by cameras and checked against a database of thousands of people “wanted for serious criminality”.

We’re well past edging towards a fictional dystopian future at this point – we’re living it right now.

If it sounds dramatic, that’s because it is. This isn’t a matter of being unnecessarily spooked by new, but otherwise helpful, technology. In fact, it might be less worrying if it were. We’re living in dangerous times, after all. Who better to protect us from them than the police and thousands of cameras?

In a world where hundreds of millions of selfies are taken every day, you may be wondering what exactly the big deal is with the police capturing our likeness. You may believe that your innocence will protect you. But none of that matters. And the police’s own data proves it.

Privacy and civil liberties advocacy group Big Brother Watch did some digging into various police forces’ use of facial recognition technology (FRT) across the country between 2017 and 2019. What it found was alarming. At the Notting Hill Carnival in 2017, 98 per cent of the Metropolitan Police’s automatic facial recognition matches wrongly identified people. For South Wales Police, the figure was 91 per cent across a number of deployments.

Eight trials carried out in London between 2016 and 2018 resulted in a 96 per cent rate of “false positives”. Meanwhile Professor Pete Fussey of Essex University, who carried out the only independent review of the Met’s public trials on behalf of the force, found the technology to be verifiably accurate in just 19 per cent of cases. The Met has previously claimed the technology is 70 per cent accurate.
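Those two figures are less contradictory than they sound, and a rough back-of-the-envelope sketch shows why. The numbers below are illustrative assumptions, not police data; the point is simply that when nearly everyone scanned is innocent, even a small false alarm rate produces alerts that are mostly wrong.

```python
# Illustrative only: all four numbers below are assumptions, not real figures.
# The point: when almost everyone scanned is innocent, even a rare false
# alarm per face adds up to far more alerts than the handful of true matches,
# so the overwhelming majority of matches point at the wrong person.

crowd = 10_000           # passers-by scanned in one deployment (assumed)
on_watchlist = 5         # of whom this many are genuinely wanted (assumed)
true_match_rate = 0.70   # chance a wanted face is correctly flagged (assumed)
false_alarm_rate = 0.01  # chance an innocent face is wrongly flagged (assumed)

true_alerts = on_watchlist * true_match_rate              # about 3.5 correct alerts
false_alerts = (crowd - on_watchlist) * false_alarm_rate  # about 100 wrong alerts
share_wrong = false_alerts / (true_alerts + false_alerts)

print(f"{share_wrong:.0%} of alerts point at the wrong person")  # about 97%
```

On those assumed numbers, a system that is “70 per cent accurate” per wanted face still gets roughly 97 per cent of its actual matches wrong, the same order as the failure rates Big Brother Watch reported.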

The likelihood of misidentification rises considerably for people of colour and for women. If you happen to be both, consider the worry that could bring if FRT is extended across the rest of the UK. The government has already been warned that the rollout of the technology could be a “spectacular own goal” if it is not sufficiently accurate.

It’s not just that this technology figures out who you are, rightly or wrongly – it’s what happens to the information that’s collected that should concern you too.

You may recall the revelation in 2019 – after the force first denied any involvement – that the Metropolitan Police had supplied images for a private company database used for LFR scans in King’s Cross. When the Information Commissioner’s Office investigated, it found that the use of the technology in public spaces by Argent, the company in question, posed a “potential threat to privacy that should concern us all”.

Last year, a man who covered his face while walking past surveillance cameras, in an attempt to protect his privacy, was fined – in spite of the Metropolitan Police’s assertion that “anyone who declines to be scanned will not necessarily be viewed as suspicious”. Such assurances have clearly rung hollow. In 2018, the Met’s equality impact assessment of LFR technology suggested that the biometrics commissioner “supported” it. But a statement released on 11 February 2020 by Professor Paul Wiles, the commissioner, says the opposite. He wrote: “I have continually said that we need proper governance of new biometric technologies such as LFR through legislation. In my view it is for parliament to decide whether LFR ought to be used by the police and if so for what purposes.”

MPs and MSPs have already spoken out about the problems this technology has caused and will continue to cause. Despite the Met’s promises to deploy the technology “responsibly” and “overtly” – mostly in terms of warning members of the public when LFR is in operation – it’s clear the measures don’t go far enough. Those walking through Stratford on Tuesday were said to have been reassured by signs stating that “there is no legal requirement for you to pass through the LFR system”, but Sian Berry, co-leader of the Green Party, who went down to the Stratford Centre to see the operation, said many people were already in full view of the cameras before the sign was even visible.

As Labour’s Chi Onwurah told MPs last month, facial recognition “automates the prejudices of those who design it and the limitations of the data on which it is trained”. That should worry all of us.

This is an issue that spans the world, not just the UK. In the US, Clearview, a company courting clients in law enforcement, is currently facing criticism over the accuracy claims it makes for its FRT. It has claimed that its technology is accurate for “all demographic groups”, despite having tested it on just 834 politicians.

The European Commission, too, has said it is considering a three- to five-year ban on FRT in public spaces, to allow for “a sound methodology for assessing the impacts of this technology and possible risk management measures”.

Boris Johnson’s government has shown in recent days – with the disputed deportation of people to Jamaica – that it has no qualms about pushing the envelope in regard to law and order. The use of FRT is an extension of that mindset.

Apathy is the enemy when it comes to the creeping use of such technology. We may not be able to stop FRT in its tracks, but we can certainly make enough noise to give authorities pause.
