Predictive policing is ‘modern method of racial profiling’, says Amnesty
The Met’s use of technology to predict where crime will happen and who will commit it is racist, a new report from Amnesty International says.
The report, ‘Automated Racism’, says predictive policing is so “dangerous” and “discriminatory” it should be banned.
Black people are repeatedly targeted by police and therefore over-represented in police intelligence, stop-and-search or other police records, Amnesty says.
This means the data driving the predictive systems relies on “racist” police practices already established within the force.
The report, published last week, said: “These systems are, in effect, a modern method of racial profiling, reinforcing racism and discrimination in policing.”
The police say predictive policing helps to tackle and prevent crime, by concentrating resources in areas they are most needed.
Risk Terrain Modelling (RTM) is a predictive policing system that uses police data to generate a location-based risk score. An initial deployment of RTM targeted the north of Lambeth and Southwark from September 2020 onwards.

Between December 2020 and October 2021, Lambeth had the second highest volume of stop and search of all London boroughs, according to the report.
People of “black ethnic appearance” had the highest rate of stop and search encounters of any ethnic group, four times that of people of “white ethnic appearance”.
According to Amnesty, 80 per cent of these stops and searches resulted in no further action.
In the same period, Lambeth had the second highest volume of police use of force of all London boroughs, with the majority of incidents against people recorded as “black or black British”.
In Southwark in the year ending March 2021, Black people were stopped and searched at 3.3 times the rate of white people. Police used force against people in Southwark at least 8,924 times between September 2020 and September 2021; 45 per cent of these incidents involved “black or black British” people.
Sacha Deshmukh, chief executive at Amnesty International UK, said: “No matter our postcode or the colour of our skin, we all want our families and communities to live safely and thrive.
“The use of predictive policing tools violates human rights. We are all much more than computer-generated risk scores.”

The Met police is one of 33 forces across the UK to have used predictive profiling or risk prediction systems, according to Amnesty.
Of these, 32 have used “geographic crime prediction, profiling, or risk prediction tools”.
Profiling is where individuals are placed on a database and profiled as someone “at risk” of committing certain crimes based on intelligence and suspicion of involvement in crime, the report says.
Shaun Thompson, 39, who volunteers with anti-knife crime group Street Fathers, was returning home from a shift in Croydon in February 2024, when he was misidentified by the Met’s facial recognition technology outside London Bridge station.
The Southwark resident said he was detained by police for half an hour.
He said: “They were telling me I was a wanted man, trying to get my fingerprints and trying to scare me with arrest.”
Mr Thompson said he was only let go after handing over a copy of his passport.
He said: “Trust is crucial to combatting knife crime and youth violence. If police increase their reliance on technology which is biased it could make it harder to build trust in communities that feel over-policed or misrepresented.

“If policing is predictive, and not rooted in on-the-ground experience, it can increase feelings of injustice and discrimination in targeted communities.”
In June 2024, Mr Thompson launched a legal challenge against the Met.
Senior advocacy officer of Big Brother Watch, Madeline Stone, 29, from Bexley, said: “Anyone flagged by this technology is guilty until proven innocent, reversing a vital principle of the British justice system.”
A spokesman for the National Police Chiefs’ Council said: “Policing uses a wide range of data to help inform its response to tackling and preventing crime, maximising the use of finite resources. As the public would expect, this can include concentrating resources in areas with the most reported crime.
“Hotspot policing and visible targeted patrols are the bedrock of community policing, and effective deterrents in detecting and preventing anti-social behaviour and serious violent crime, as well as improving feelings of safety.
“It is our responsibility as leaders to ensure that we balance tackling crime with building trust and confidence in our communities whilst recognising the detrimental impact that tools such as stop and search can have, particularly on Black people.”
Pictured top: Lines of police march through central London (Picture: Claudia Lee)