Facial recognition technology used by retailers including B&M, Sainsbury’s and Home Bargains is leading to false accusations of shoplifting around the UK, a Guardian report has revealed.
Multiple shoppers across the country informed the paper that they had been approached by security staff after being incorrectly identified as thieves by shops using Facewatch’s AI-powered facial recognition software.
On its website, Facewatch claims a 99.98 per cent accuracy rate and says it sent more than 50,000 positive alerts about known offenders in March alone.
However, affected shoppers told the Guardian they were distressed and embarrassed by the errors, which led to them being thrown out of stores or followed around by security guards. They also found the process for lodging a complaint with Facewatch opaque, with one describing it as “guilty until proven innocent”.
Warren Raja, an affected shopper and data strategist of Asian descent, noted that such systems are notoriously poor at differentiating between people with darker skin, something the Home Office admitted in December.
“We already live in a country that has issues with racism, it’s an unavoidable issue,” he told The Guardian. “And we know cameras cannot pick up features of people that have darker features with as much accuracy. And this could be happening to people who are much more vulnerable than me.”
The technology is also increasingly used by police forces on high streets around the country. On Sunday, the biometrics commissioner for England and Wales, Prof William Webster, told the paper in a separate story that facial recognition legislation was “trying to catch up with the real world”. Scotland’s biometrics commissioner added that the technology was “nowhere near as effective as the police claim it is”.
The story also reported a whistleblower’s claim that shop-based face-scanning systems had been “maliciously” misused by shop or security staff to add members of the public to watchlists.
In response to the Guardian, Facewatch’s chief executive, Nick Fisher, said: “We are aware of the matters referenced and in each case, we acted promptly once they contacted the Facewatch data protection team.
“These cases relate to human error in the way processes were carried out in-store, rather than any failure of Facewatch’s technology. We are sorry these individuals experienced being challenged while shopping and understand why this would have been upsetting.
“These three errors are extremely rare cases when viewed in the context of the more than 500,000 alerts we send to retailers each year, but we recognise that any mistake is upsetting for the individual concerned. The system is designed to support, not replace, human decision-making.”
While Facewatch is the only legally compliant service offering live facial recognition in stores, non-live facial recognition provider Auror is also gaining ground in the retail industry. It is currently in use at several retailers, including Marks and Spencer, and was trialled for 10 weeks at Tesco stores in January.