You know how most dystopian future thrillers feature facial recognition cameras that authoritarian governments use to track and control their citizens’ every move? Amazon is working hard to make exactly that sort of facial recognition a reality.
To make matters even worse, it has recently been revealed in the UK that some modern facial recognition systems have shockingly bad accuracy, with up to 98% of their “matches” misidentifying innocent people. I haven’t been able to find stats on the accuracy of Amazon’s Rekognition system, but governments’ enthusiasm for embracing questionably accurate technology to profile their citizens is troubling nonetheless.
What is Amazon Rekognition and why should you care?
Amazon Rekognition is a deep-learning AI service that can analyze videos and images for a variety of applications, including text recognition, image content analysis, the flagging of inappropriate content… and facial recognition for surveillance cameras. In a blog post, Amazon claims that the system is capable of performing “real-time face searches against collections with tens of millions of faces… In security and safety applications, you can now identify people of interest against a collection of millions of faces in near real-time, enabling use cases such as timely and accurate crime prevention.”
The company isn’t shy about the potential dystopian applications for its software and lists the city of Orlando as one of its existing trial clients. As the ACLU pointed out, there is the “possibility that those labeled suspicious by governments — such as undocumented immigrants or Black activists — will be seen as fair game for Rekognition surveillance.”
As we’ve seen in the Securus phone tracking scandal, there’s absolutely no reason to trust your government with sensitive and invasive data collection, since that data has been and can still be misused, abused, or carelessly exposed. Now, local governments that haven’t earned the public’s trust when it comes to data collection will find it even easier to profile citizens and track their every move.
Hold on, though – hasn’t facial recognition technology been around for a while? Why is this news? Well, the facial recognition that many law enforcement agencies have been using was only available for images or recorded videos. This means that, in the event of a crime, investigators would retrieve video or photos relevant to their investigation and then identify the individuals depicted. Amazon’s Rekognition service, however, promises to turn any camera using it into a real-time face-recognizing and person-tracking surveillance machine, expanding local governments’ surveillance capabilities massively.
Serious concerns with facial recognition accuracy
The passive use of real-time facial recognition to monitor the actions of innocent and suspect citizens 24/7 is a massive ethical issue and breach of trust and privacy, but the problem doesn’t end there. It is unclear how accurate Amazon’s Rekognition is (Amazon insists that it’s accurate), but the results from other live facial recognition systems have been absolutely abysmal. According to a report by Big Brother Watch, a UK-based digital privacy watchdog organization, “on average, a staggering 95% of ‘matches’ wrongly identified innocent people.”
In its analysis of the South Wales police force, Big Brother Watch’s report pointed to the real-world harm of these errors: “31 innocent members of the public incorrectly identified by the system who were then asked to prove their identity and thus their innocence.”
In short, UK governments already testing and using technologies just like Amazon’s Rekognition are almost totally incapable of correctly identifying anyone – but that doesn’t stop them from trying to arrest people based on this highly inaccurate and invasive technology. At best, the adoption of systems like Amazon’s Rekognition will be a bumbling and incompetent abuse of government power that will arbitrarily endanger random citizens, expose data about citizens’ locations and actions to hackers, and allow criminals to slip through the cracks. At worst, it will evolve into an invasive and dystopian status quo in which citizens can expect their every single action to be closely analyzed by law enforcement organizations.
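The arithmetic behind these dismal hit rates is worth spelling out: when almost everyone scanned is innocent, even a matcher with a low per-face error rate will produce mostly false matches. The sketch below uses entirely hypothetical numbers (not Amazon’s or any police force’s published figures) to illustrate this base-rate effect:

```python
# Illustrative only: why a seemingly accurate face matcher can still
# produce mostly false matches. All numbers here are hypothetical.

crowd_size = 100_000         # faces scanned at an event
watchlisted_present = 10     # people of interest actually in the crowd
false_positive_rate = 0.001  # 0.1% of innocent faces wrongly "matched"
true_positive_rate = 0.90    # 90% of watchlisted faces correctly matched

true_matches = watchlisted_present * true_positive_rate
false_matches = (crowd_size - watchlisted_present) * false_positive_rate

share_wrong = false_matches / (true_matches + false_matches)
print(f"{share_wrong:.0%} of all matches point at innocent people")
```

Even with a 0.1% false positive rate, roughly 100 of the 100,000 innocent faces get flagged against only about 9 genuine hits, so over 90% of the alerts target innocent people. This is consistent with the watchdog findings quoted above: the problem is not only the matcher’s raw error rate but the tiny fraction of real targets in any crowd.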