Facial recognition identifies people wearing masks
Japanese company NEC, which develops facial-recognition systems, has launched one that can identify people even when they are wearing masks.
It homes in on the parts of the face that are not covered, such as the eyes, to verify the wearer's identity.
Verification takes less than one second, with an accuracy rate of more than 99.9%, NEC says.
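NEC has not published how its system works internally. As a rough illustration of the general idea behind this kind of verification (every name, the threshold, and the random stand-in "embeddings" below are hypothetical, not NEC's method), a system might extract a numeric vector from the uncovered eye region and compare it against an enrolled vector for the claimed identity:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: near 1.0 for similar faces."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe_embedding, enrolled_embedding, threshold=0.6):
    # Accept the identity claim if the eye-region embeddings are close enough.
    return cosine_similarity(probe_embedding, enrolled_embedding) >= threshold

# Toy example: random 128-d vectors standing in for a real model's output.
rng = np.random.default_rng(0)
enrolled = rng.normal(size=128)
same_person = enrolled + rng.normal(scale=0.1, size=128)  # small perturbation
stranger = rng.normal(size=128)                           # unrelated vector

print(verify(same_person, enrolled))  # similar vectors: accepted
print(verify(stranger, enrolled))     # unrelated vectors: rejected
```

The threshold trades false accepts against false rejects; a production system tunes it against the error rates the vendor quotes, rather than using a fixed value like the 0.6 assumed here.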
The Met Police uses NEC’s NeoFace Live Facial Recognition to compare faces in a crowd with those on a watchlist.
Other clients include Lufthansa and Swiss International Airlines.
And NEC is trialling the system for automated payments at a shop in its Tokyo headquarters.
Shinya Takashima, assistant manager of NEC’s digital platform division, told the Reuters news agency the technology could help people avoid contact with surfaces in a range of situations.
It had been introduced as “needs grew even more due to the coronavirus situation”, he added.
Before the coronavirus pandemic, facial-recognition algorithms failed to identify 20-50% of images of people wearing face masks, according to a report from the National Institute of Standards and Technology.
But by the end of 2020, it reported a vast improvement in accuracy.
Facial recognition has proved controversial.
There have been questions over how well systems recognise darker shades of skin, alongside ethical concerns about invasion of privacy.
In August, the use of such systems by Welsh police forces was ruled unlawful in a case brought by a civil-rights campaigner.
And in the US big technology companies, including Amazon and IBM, have suspended the use of facial-recognition software by police officers, to allow lawmakers time to consider legislation on how it should be deployed.
A trial of facial recognition technology within 18 Co-op food stores has sparked outrage from privacy advocates.
The system, from start-up Facewatch, alerts workers when someone with a past record of “theft or anti-social behaviour” enters the store.
The supermarket said the pilot was intended to protect workers from assaults by shoplifters.
Privacy groups say they are “deeply concerned” by the trial.
The initiative was organised by the Southern Co-operative, which is independent of the larger Co-op chain but runs more than 200 stores in the south of England using the same brand.
The trial was first reported by Wired, which picked up on a blog post published on Facewatch’s website by Southern Co-op’s loss prevention officer Gareth Lewis.
Mr Lewis wrote that the retailer had completed a “successful trial using Facewatch in a select number of stores where there is a higher level of crime”.
The technology is still being used in those stores but there are no plans to roll it out more widely, the firm told the BBC.
In an open letter to the retailer, Privacy International questioned the legality of the technology in stores. It also asked whether information was being shared with the police.
Silkie Carlo, director of civil rights group Big Brother Watch, said: “To see a supposedly ethical company secretly using rights-abusive tech like facial recognition on its customers in the UK is deeply chilling.
“This surveillance is well-known to suffer from severe inaccuracy and biases, leading to innocent people being wrongly flagged and put on criminal databases.
“Live facial recognition is more commonly seen in dictatorships than democracies. This is a serious error of judgement by Southern Co-op and we urge them to drop these Big Brother-style cameras immediately.”
Assaults and violence
The Southern Co-operative said there were clear signs about the system in the stores involved in the trial and it was GDPR-compliant.
In a statement to the BBC, it added that no data had been shared with police.
“Already this year, we have seen an 80% increase in assaults and violence against our store colleagues.
“The purpose of our limited and targeted use of facial recognition is to identify when a known repeat offender enters one of our stores.
“This gives our colleagues time to decide on any action they need to take, for example, asking them to politely leave the premises or notifying police if this is a breach of a banning order.”
It added that violence in stores occurred when “a colleague intervenes after a theft has already taken place” and using facial recognition “improved the safety of our store colleagues”.
The Co-op is not the only supermarket to use image recognition technology to catch thieves.
This summer, Sainsbury’s trialled an AI-enabled concealment detector in several stores, which was able to spot if a customer had pocketed an item and send a short video to security staff.
The supermarket partnered with start-up ThirdEye, which claims the system stopped 5,591 attempted thefts.
Facewatch describes itself as a “cloud-based facial recognition security system” and works with retailers in Argentina, Brazil and Spain.
Last year, it was reported the firm was on the verge of signing data-sharing deals with the Metropolitan Police and the City of London police, and was in talks with constabularies in Hampshire and Sussex.