
Police facial recognition system has ‘potential to entrench racial bias’

A privacy campaign group has hit out at the Met’s facial recognition system, claiming it has the potential to “entrench racial bias” in policing.

Police are increasingly using facial recognition systems, which cross-check a live camera feed of faces against a database of known identities and custody images, to make arrests.
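In outline, systems of this kind reduce each face in the feed to a numerical “embedding” and compare it with stored embeddings of people on a watchlist, raising an alert when the similarity clears a threshold. The sketch below is illustrative only, not the Met’s system; the threshold value, function names and data layout are all assumptions:

```python
import numpy as np

# Illustrative sketch of the matching step in live facial
# recognition. Every name and value here is an assumption,
# not taken from the Met's actual system.

SIMILARITY_THRESHOLD = 0.6  # assumed operating threshold

def cosine_similarity(a, b):
    # Compare two face-embedding vectors (higher = more similar).
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_face(live_embedding, watchlist):
    # watchlist maps an identity to its stored embedding,
    # e.g. one built from a custody image. Returns the best
    # match above the threshold, or None (no alert raised).
    best_id, best_score = None, SIMILARITY_THRESHOLD
    for identity, stored in watchlist.items():
        score = cosine_similarity(live_embedding, stored)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id
```

In a sketch like this, the threshold sets the trade-off at the heart of the dispute: lower it and more genuine matches are flagged, but more passers-by are wrongly matched against the database.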

The system was trialled to see if there was potential for bias and misidentification, but the Met said there were “no false matches” – cases where the wrong person is identified and could be arrested as a result.

But Madeline Stone, 28, from Bexley, senior advocacy officer at Big Brother Watch, said the system’s reliability should not be judged on the algorithm alone.

She said: “After the Casey report, we know that certain communities are over-policed and over-surveilled and that people of colour are often disproportionately impacted by this over-policing. 

“If biased data is going into a system, you will get discriminatory results. In that way, facial recognition has the potential to entrench racial bias.”

The database used by the Met includes custody images of people who have been arrested and released with no further charge.

A Freedom of Information request submitted by the South London Press to the Met shows that, outside of facial recognition use, 974 black people were arrested and released without charge in the past six months in Lambeth, Lewisham and Southwark.

This compares with 730 white people arrested and released without charge in the three boroughs over the same period.

In each month from May until October, the number of black people arrested and released without charge in Lambeth and Southwark was consistently higher than the number of white people arrested and released without charge.

Privacy campaigners believe these figures show that biased behaviour by police officers creates skewed data, which will in turn produce biased identifications by facial recognition technology.

Ms Stone said: “Discriminatory policing, combined with technology, could mean that people of colour are more likely to be placed on facial recognition databases and more likely to be misidentified by facial recognition.”

The retention of custody images of innocent people came under fire in 2012 after the High Court ruled that the Met had breached the human rights of a woman and boy they arrested by keeping their custody pictures after deciding to take no action against them.

People can apply to the police to have their images deleted after the conclusion of proceedings, but the Met are entitled to refuse their application.

According to Home Office statistics from 2016, 1,003 people applied to have their police records deleted; 233 of those applications were accepted by the police.

Ms Stone said: “The Met should be deleting these images, but police forces have claimed it’s too difficult. That’s unacceptable.”

Ms Stone said that using technology can distance police from criticism and “encourage a lack of accountability”.

She said: “There is no legislation that mentions facial recognition right now and MPs haven’t even debated this technology. The police use this technology how and where they like.”

A spokesman for the Met said: “Facial recognition technology is a community crime-fighting tool that enables us to be more precise and focussed in how we protect the public and tackle crime.

“Transparency is a key part of building trust and confidence within all of our communities. We are committed to being open in how we use facial recognition and we publish a vast number of documents that allow the public to see exactly how we use the technology.

“We know through independent testing carried out by the National Physical Laboratory that the algorithm we use for live facial recognition, at the setting we use it at, performs the same across race and gender. The chance of a false match is a maximum of one in 6,000 people who pass the camera.

“The College of Policing sets out guidance for the deployment of live facial recognition technology to locate a person on a watchlist. Live facial recognition watchlists are specifically created for each deployment using intelligence and relate to specific crime types and not individuals.

“The watchlist is deleted at the end of the deployment.”
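To put the quoted rate in context, a rough back-of-envelope reading (the crowd size below is an assumed figure for illustration, not one published by the Met):

```python
# The Met's quoted worst case: at most 1 false match per
# 6,000 people who pass the camera.
false_match_rate = 1 / 6000
passers_by = 18_000  # assumed crowd size, illustration only
print(false_match_rate * passers_by)  # up to 3.0 false alerts
```

On that reading, a busy deployment could still generate a handful of false alerts, which is why campaigners argue the composition of the underlying database matters as much as the algorithm’s accuracy.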

Pictured top: A camera being used during trials at Scotland Yard for the new facial recognition system (Picture: PA)

