- The Metropolitan Police will be scanning Christmas shoppers in the UK this week as part of a facial recognition trial.
- Previous tests were scheduled during the Notting Hill Carnival and Remembrance Day services.
- Soho, Piccadilly Circus, Leicester Square, and other major shopping hubs will be part of the trial.
Amid privacy concerns, facial recognition is slowly being adopted by law enforcement around the world. The Metropolitan Police is set to run its seventh public trial of facial recognition software this week, with Christmas shoppers in London becoming part of the test.
The Metropolitan Police's previous trial runs took place at the Notting Hill Carnival in 2016 and 2017 and at last year's Remembrance Day services. The venues selected for this trial include popular shopping areas such as Soho, Leicester Square, and Piccadilly Circus, among other locations.
The Metropolitan Police will mount the cameras on lampposts and police vans, and officers will use software developed by NEC. The Japanese company designed the software for surveillance cameras, and it is capable of measuring the facial structure of passersby. Everyone who is scanned will have their face compared against police mugshots from the official database, and any matches will be examined.
Big Brother Watch, a non-profit, non-partisan British civil liberties and privacy campaigning organization, spoke out against the Metropolitan Police's use of facial recognition, stating: “Individual officers don’t have the ability to biometrically scan every face in a crowd; to scan millions of people against watch lists of hundreds of people, or to automatically record biometric images of people walking by. Automated facial recognition significantly compromises the privacy of ordinary members of the public in a manner entirely different to super recognizers.”
Facial recognition software has repeatedly been shown to be error-prone, and many pro-privacy organizations argue that even a small margin of error is unacceptable. Even if a facial recognition system is accurate 95% of the time, as many as 1 in 20 scanned individuals could be wrongly flagged as suspects and face unneeded harassment.
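The scale of that error margin becomes clearer with a little arithmetic. The sketch below is purely illustrative (the crowd size and error rate are assumptions, not figures from the Met Police trial): it shows how a 5% false-flag rate, i.e. 1 in 20, translates into wrongly flagged people when a large crowd is scanned.

```python
# Illustrative base-rate arithmetic for a crowd-scanning system.
# Assumed numbers: a 5% chance of wrongly flagging any given innocent
# passerby (the "1 in 20" figure), applied to a hypothetical crowd.

def expected_false_flags(people_scanned: int, false_flag_rate: float) -> float:
    """Expected number of innocent people wrongly flagged by the system."""
    return people_scanned * false_flag_rate

# At 95% accuracy (5% error), 1 person in every 20 scanned is flagged in error:
print(expected_false_flags(20, 0.05))       # 1.0
# Scanning a busy shopping district of 10,000 people compounds the problem:
print(expected_false_flags(10_000, 0.05))   # 500.0
```

Because genuine watch-list matches in a random crowd are rare, wrongly flagged people can easily outnumber true matches, which is the core of the base-rate objection raised by privacy groups.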