Rights groups have expressed serious concerns over reports that a UK supermarket chain recently completed a trial of controversial facial recognition technology.
The Southern Co-operative, whose stores cover 10 counties across the south of England, revealed the trial in a little-noticed update dated two months ago. However, recent media coverage has spurred interest from privacy campaigners.
London-based Privacy International said it has written to the chain requesting urgent assurances over its partnership with UK startup Facewatch. Facewatch describes its technology as a cloud-based facial recognition system designed to safeguard businesses against crime, saying the tech “sends you instant alerts when subjects of interest enter your business premises.”
Aside from concerns over whether the Co-op’s use of Facewatch complied with strict data protection and privacy laws, the rights group wants to know whether the trial may have exposed innocent shoppers to unwarranted police scrutiny.
“In October 2020, Privacy International urged authorities in the UK to investigate evidence that Facewatch is offering to transform its crime alerting system into another surveillance network for UK police forces, by offering them the ability to ‘plug-in’ to the system. We are still awaiting responses,” it said.
“We are concerned that such a deployment at Southern Co-op stores – even at trial level – could mean that, in order to purchase essential goods, people might be in effect left with no choice but to submit themselves to facial recognition scans.”
An October blog post penned by the supermarket chain’s loss prevention officer, Gareth Lewis, explained that the trial covered a select number of stores that had experienced high levels of crime. Customers were made aware of the trial by “distinctive signage,” and no facial images were stored “unless they have been identified in relation to a crime.” He claimed this made the trial GDPR-compliant.
Ray Walsh, digital privacy expert at ProPrivacy, echoed Privacy International’s concerns.
“The problem with allowing private businesses to use real-time scanning is that the facial recognition cameras could theoretically become part of constant real-time surveillance leveraged by the police or other government agencies, of the like seen in countries such as China,” he argued.
“These systems mean everyone who ventures out in public is being constantly scanned. It is vital for regulation on the use of facial recognition to offer transparency over how the general public’s data is collected, stored, processed and transported to ensure privacy and data security."