Big Data Intersects with Big Brother

In London, fears grow about facial recognition in private places

Last week, Britain’s data protection watchdog launched an investigation into the use of facial recognition by a property developer in King’s Cross in London, after it emerged that visitors to the area around King’s Cross railway station had been covertly scanned. As artificial intelligence matures, such systems have become remarkably good at identifying people by matching a scan of their facial features against a photograph, although the technology is still prone to errors. In other words, people aren’t just being watched, they’re being identified.
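To make that matching step concrete, the sketch below is a simplified illustration, not any vendor’s actual system: each face image is reduced to a fixed-length embedding vector by a neural network, and a probe face is identified as whichever enrolled photograph lies closest in that vector space, provided the distance falls under an acceptance threshold. The `identify` function, labels and synthetic embeddings are assumptions made purely for illustration.

```python
# A minimal, illustrative sketch of the matching step behind facial
# recognition (not any deployed product): faces are represented as
# fixed-length embedding vectors, and a probe face is matched to the
# closest enrolled photograph, if it is close enough.
import numpy as np


def identify(probe, gallery, threshold=1.0):
    """Return the label of the nearest enrolled embedding, or None if
    nothing in the gallery is within `threshold` (Euclidean distance)."""
    best_label, best_dist = None, float("inf")
    for label, enrolled in gallery.items():
        dist = float(np.linalg.norm(probe - enrolled))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist < threshold else None


# Synthetic demo: in a real system the embeddings would come from a
# neural network applied to cropped face images, not random vectors.
rng = np.random.default_rng(0)
gallery = {
    "watchlist_entry_A": rng.normal(size=128),
    "watchlist_entry_B": rng.normal(size=128),
}
probe = gallery["watchlist_entry_A"] + rng.normal(scale=0.01, size=128)
print(identify(probe, gallery))  # -> watchlist_entry_A
```

The threshold is where the accuracy problems mentioned above creep in: set it loosely and innocent passers-by are flagged as matches; set it tightly and genuine matches slip through.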

UK authorities say they are uneasy about such use of facial-recognition technology in public spaces. Even the mayor of London has voiced concern about the trend and demanded information about the King’s Cross data collection.

Big Brother Watch, a civil liberties group, found that shopping centres, museums, conference centres and casinos had all used software that compares faces captured by CCTV with those of people on watch lists, such as suspected terrorists and shoplifters. The group issued a report claiming that the use of facial recognition in the UK was reaching “epidemic” levels, with live facial-recognition technology regularly deployed at private sites across the country.

This means, according to the group, that “many millions” of citizens have unknowingly had their faces scanned by this type of 21st-century surveillance. For example, the technology was employed in “secret police trials” in Sheffield’s Meadowhall shopping centre in 2018, scanning as many as 2 million visitors’ faces without their knowledge.

Advanced facial-recognition technology is so new that few laws specifically regulate its use. That is the case in the UK, where no dedicated legal framework governs the technology, so private companies have tapped into its power for various purposes without declaring the move publicly or notifying authorities. The technology is, however, still subject to general privacy law and the Data Protection Act 2018, which gives anyone scanned the right to be informed about how their image has been collected and used, leaving such private deployments in a legal grey area.

Facial recognition is becoming increasingly pervasive as governments and businesses around the globe invest in sophisticated solutions. In the past few months we’ve written about the growing use of facial recognition, including in China, where the government has embraced the technology to track people who have landed on a creditworthiness blacklist (see, for example, Location-Based Debt Shaming and Biometrics Backlash). The technology has proven to be a boon for law-enforcement agencies, but is now becoming part of private companies’ big data efforts, taking customer relationship management products to new levels.

As is often the case, advanced security solutions that are first funded and used by government agencies trickle down and find commercial uses. There is now concern about, and backlash against, something that only a few years ago seemed to belong to the world of science fiction: the real-time recognition and tracking of people going about their everyday lives. Calls for regulation to address this new technology are growing louder, but for now, it’s clear that Big Brother is intersecting with big data.