Big Brother-style CCTV planned for Glasgow is a “very significant” threat to civil liberties and workers’ rights, according to trade unionists.
The Scottish Trades Union Congress (STUC) is concerned that new state-of-the-art camera surveillance across the city could be used to discriminate against people if left unregulated.
The STUC, which represents 540,000 trade unionists, is calling on Glasgow City Council to consult with unions over concerns about the planned use of hi-tech CCTV analytics software called Suspect Search.
The sophisticated £1.2 million surveillance system, which has been delayed for five years due to data protection laws, formed part of a £12.6 million upgrade to Glasgow’s public space CCTV system.
The Ferret revealed in 2015 the real-world capabilities of Suspect Search and how individuals can be tracked as they move around a city.
We also reported in 2016 that 70 CCTV cameras in Glasgow had the software but that neither the Scottish Government, Police Scotland, nor Glasgow City Council held any written legal guidance on how the system can be used.
The high-tech system is able to assign a unique signature to each person who walks past a camera, in real time, and can then track their movements through the city.
Camera operators can start searches for people by uploading a photograph of the person they wish to find. Alternatively, they can build an online avatar of their target, based on a description of the person’s clothes and physical characteristics.
Suspect Search was installed by the former Community Safety Glasgow, an agency set up by Glasgow City Council and Police Scotland. They said later that the introduction of the EU’s General Data Protection Regulation had caused the delay.
Glasgow City Council describes the system as “person search” rather than “Suspect Search”, and says it is not facial recognition software.
But the STUC’s general secretary, Grahame Smith, said that Suspect Search raises “very significant civil liberties and human rights issues, both in workplaces and out”.
“The STUC has yet to see any justification for its use or any assessment of how it will impact on people’s rights and liberties,” Smith added.
“Like with all new surveillance technologies, there needs to be transparency around the need for facial recognition, how it will be used and the impact it will have on members of the general public and workers in the areas where it will be deployed.”
He continued: “We know through research produced by the Trades Union Congress that many workers are concerned that surveillance on them while working could be used in a discriminatory way if left unregulated. This is a key concern of unions.
“Glasgow City Council should be consulting immediately with unions who represent workers in the vicinity of where these surveillance cameras will be placed.”
The STUC’s concerns over Suspect Search were echoed by UNISON Scotland, part of Europe’s largest public sector union, which has 150,000 members in Scotland.
Sam Macartney, chair of UNISON Scotland’s International Committee, told The Ferret: “We’re concerned from a members’ point of view as in the past we had Glasgow Life (an organisation working with the council) trying to use recordings to discipline or sack people.
“In the past we’ve been able to offset this because they are not allowed to use technology on an individual who is working without, first of all, making them aware they’re being recorded.
Macartney added: “This changes the goalposts completely so we have major concerns… it’s myself and others who have to represent people if they have to go to a tribunal to get their job back – so it’s a major concern as it blows everything out of the water.”
Glasgow City Council pointed out that there was no confirmed date for when the software would be put into operation as it was still going through the legal and approval process. “Once approved, the system will only be operated by vetted staff and will aid operators in speeding up the process in looking for a missing child, for example, in the city,” said a council spokesman.
He stressed that Suspect Search did not use facial recognition technology. “Suspect Search software allows operators to use a variety of other information to search camera feeds,” he stated.
“So, for example, they may know they are looking for a female child of a certain age and height, wearing clothing of a particular colour and type, wearing a hat, carrying a bag etc. That allows them to build an avatar and search for potential matches, which the operator would then view.”
The Ferret previously reported that in trials Suspect Search was able to track people successfully, even in busy streets with hundreds of people walking past.
As well as tracking the movements of people, the software can trigger alerts to camera operators if crowds gather unexpectedly, or when people enter areas defined by the camera operators.
One of the first successful tests carried out was to see if the system could automatically spot people putting a traffic cone on the Duke of Wellington statue in the city centre.
Suspect Search was developed by an Israeli security firm called NICE Systems, which changed its name to Qognify in 2015 after its physical security business was bought by Battery Ventures, headquartered in Boston in the US.
As reported by The Ferret last year, the Scottish Government plans to appoint an independent watchdog to monitor how facial recognition is used. MSPs criticised the lack of regulation of facial images held by Police Scotland as long ago as June 2015.
This led to a January 2016 report by HM Inspectorate of Constabulary in Scotland which confirmed that facial images of 334,594 Scottish people held by Police Scotland had been uploaded to the Police National Database.
Earlier this month, police in England faced calls to end the use of facial recognition software to search for suspected criminals in public after independent analysis found matches were only correct in a fifth of cases.
Academics from the University of Essex were granted access to six live trials by the Metropolitan Police in Soho, Romford and at the Westfield shopping centre in Stratford, east London.
They found the system regularly misidentified people who were then wrongly stopped and that it was unlikely to be justifiable under human rights law.
They also warned of “surveillance creep”, with the technology being used to find people who were not wanted by the courts.