Police Scotland’s use of a controversial form of facial recognition has tripled over the last five years, and is on course to rise even further in 2023, heightening concerns that suspects could be wrongly identified.
The force has increased its use of retrospective facial recognition, where algorithms help identify people caught on camera at crime scenes by comparing their faces with millions of custody images stored on a UK-wide police database.
Police Scotland’s searches of the facial matching function on the police database jumped from just under 1,300 in 2018 to nearly 4,000 in 2022.
This made the Scottish police the fourth most prolific force in the UK for using retrospective facial recognition last year. More than 2,000 searches were carried out in the first four months of 2023, suggesting the rate is continuing to accelerate.
Police Scotland has described the technology – where a shortlist of matches is generated and then reviewed by an officer retrospectively – as an “extremely useful tool in helping officers to identify those who commit crimes”, including child exploitation.
But campaigners and politicians have pointed to evidence that facial recognition technology can throw up “false positives” – people wrongly matched to the suspect by the software – and has been shown to be biased against women and people of colour.
One expert, who has advised the Scottish Government on facial recognition in the past, said Police Scotland should suspend its use of the technology until it can prove there is “robust evidence” that its use is “appropriate, proportionate and effective”.
The police database holds millions of images of people held in custody by police forces across the UK, including many who were found to be innocent and released without charge or later cleared at trial.
According to Home Office data, total searches by all UK forces increased from 3,360 in 2014 to 85,158 in 2022. Last year nearly 30 per cent of searches – 24,677 – were carried out by the Metropolitan Police.
Other forces whose searches increased dramatically include Merseyside Police, who made 125 searches in 2014 and 3,896 last year. Police Scotland was just behind, clocking up a total of 3,848 searches in 2022.
In 2012 the High Court ruled that the Metropolitan Police was in breach of human rights legislation because it had retained images of a woman and a boy who had been arrested, despite taking no further action against them. Five years later, the then home secretary, Amber Rudd, said police forces must delete images of innocent people at their request. However, campaigners say many images of innocent people remain on the database.
Police Scotland operates a different policy from other UK forces, only uploading custody images to the database once an individual has been charged with a crime, and removing the images of those found innocent.
But the removal of the custody images of people found innocent at trial can take up to six months. The images of Scots charged by forces in other parts of the UK will remain on the database even if they are proven innocent, unless they specifically ask for them to be removed.
Police Scotland also retains custody images from old IT systems used by local forces before the national force was set up. These include pictures of those convicted of any past crime, including minor ones, who may still show up in searches today.
The UK database – known as the Police National Database (PND) and managed by the Home Office – has a facial search function which compares images of suspects caught on CCTV, mobile phones or dash cams with custody images. A shortlist of potential facial matches is produced and then reviewed by an officer to decide if any match the suspect.
It was introduced following the Soham murders of schoolgirls Jessica Chapman and Holly Wells in 2002. Their killer, Ian Huntley, was known to various English police forces but information was not shared between them.
Despite police claims of its effectiveness, research has shown that most facial recognition algorithms perform poorly when identifying anyone who isn’t a white man. This means women and people with darker skin tones are more likely to face wrongful arrest after being incorrectly identified by the technology.
Liam McArthur MSP, justice spokesperson for the Scottish Liberal Democrats, criticised the “exponential rise” in the use of retrospective facial recognition by Police Scotland “despite ample evidence about the shortcomings and risks involved”.
In October 2019 the Scottish Parliament’s justice sub-committee held an inquiry into the issue. A subsequent report welcomed confirmation from Police Scotland that it had no intention of introducing ‘live’ facial recognition technology, which can monitor and identify suspects in real time, despite a stated aim to do so by 2026 in its 10-year strategic plan.
But it also recommended that the use of retrospective facial recognition be reviewed by the Scottish Police Authority and the Scottish Biometrics Commissioner – a new role tasked with overseeing Police Scotland’s use of data that can be used to identify people.
McArthur said: “We know, for example, that facial recognition throws up far too many ‘false positives’ and contains inherent biases that are known to be discriminatory.
“Of course, the police should be able to make the best use of available technology in their efforts to combat crime and keep communities safe. They must do this, however, with appropriate safeguards in place and with a clear legal framework underpinning the use of tools like facial recognition.”
Professor Angela Daly, a member of the Scottish Government’s independent advisory group on new technologies in policing, told The Ferret the sharp escalation in the use of retrospective facial recognition technology was a “concerning development” which should lead to a further “public, democratic discussion about what is acceptable in terms of police use of technology”.
Daly argued retrospective facial recognition should be suspended until Police Scotland can prove its use is “appropriate, proportionate and effective and that it has social licence from the public to do so”.
Patrick Corrigan, head of nations and regions at Amnesty International UK, argued for an “outright ban” on facial recognition because of its potential to “exacerbate systemic racism” in policing.
“That there seems to be an increase in use of the PND by Police Scotland is in no way coherent with its commitment to operate as a rights-based organisation,” Corrigan added.
The Scottish Biometrics Commissioner, Brian Plastow, who is also a former police chief superintendent, claimed he didn’t have concerns about the number of searches being carried out. But he said Police Scotland should do more to improve public confidence and trust by publishing data for transparency purposes.
He questioned the reliability of facial recognition software, claiming it was “not validated or accredited to any recognised international scientific standard”. “By contrast, the technologies used for DNA and fingerprints conform to international scientific standards and the processes deployed are accredited by the UK Accreditation Body (UKAS),” he added.
A review of the retention of images on legacy systems is being conducted, he said, and he is due to report to the Scottish Parliament in March 2024.
A Police Scotland spokesperson said: “Police Scotland uses facial matching technology provided through the Home Office’s Police National Database. It is proven technology and can help protect our communities from criminals.
“It is also an extremely useful tool in helping officers to identify those who commit crimes including child abuse and exploitation, fraud and extortion, which continue to grow with increasing sophistication.”
A Scottish Government spokesperson said the decision to use any technology with facial recognition capability was an operational matter for Police Scotland.
In 2015 Glasgow City Council attempted to introduce surveillance cameras, provided by Israeli technology firm NICE, which it said would allow council staff to monitor crowds and look out for intruders or people loitering in parks, landmarks and back lanes, based on information about their clothing, height and other distinguishing features.
However this ‘Suspect Search’ system has still not been used due to data protection laws and concerns raised by civil liberties campaigners and trade unions, as previously reported by The Ferret.
At the Conservative Party conference in October, the UK policing minister, Chris Philp, said he wanted to add people’s passport photos to the PND. The proposal was condemned by the biometrics commissioner, Brian Plastow, who said it was “unethical and potentially unlawful”.
Philp also wrote to police chiefs in England and Wales in October urging them to double their use of retrospective facial recognition.
Main image: inkoly/iStock/Canva
This Ferret story was also published with the Sunday National.