Why does Apple want to backdoor users’ iPhones?
Many consider Apple to be strong on user data privacy and security, which is why their proposal to backdoor and scan all users’ iPhones has surprised and dismayed so many. Why are they doing this? Is backdoor access to your private files ever OK?
Why does Apple want to backdoor iPhones?
First, what do we mean by “backdoor” here? Apple wants the ability to scan the photo library of every single person using an iPhone: to view their photos, scan their contents, and compare them against databases of other images that Apple has access to.
The reason they have given for doing this is certainly a noble one. These Expanded Protections for Children are part of a toolset Apple wants to use to combat the abuse and sexual exploitation of children.
Specifically, the backdoor access would let Apple scan all users’ photos and compare them to a database of known images of child exploitation or sexual abuse. Matches would be reported to the national organizations responsible for monitoring these sorts of crimes.
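To make the matching step concrete, here is a minimal, hypothetical sketch in Python. Apple’s real system uses a perceptual hash (NeuralHash) and cryptographic matching protocols rather than anything this simple; the SHA-256 comparison and every name below are illustrative assumptions, not Apple’s implementation.

```python
# Hypothetical sketch of database matching, NOT Apple's NeuralHash system.
import hashlib
from pathlib import Path

# Assumption: a set of hashes of known abusive images, supplied by a
# child-safety organization. The entry below is a placeholder.
KNOWN_IMAGE_HASHES = {"placeholder-hash-value"}

def image_fingerprint(photo: Path) -> str:
    """Hash a photo's raw bytes.

    A real system uses a perceptual hash so that resized or re-encoded
    copies of a known image still match; the cryptographic SHA-256 hash
    used here only catches byte-identical files.
    """
    return hashlib.sha256(photo.read_bytes()).hexdigest()

def scan_library(library: Path) -> list[Path]:
    """Return every photo whose fingerprint appears in the known database."""
    return [p for p in library.glob("*.jpg")
            if image_fingerprint(p) in KNOWN_IMAGE_HASHES]
```

Note that the controversy is less about this matching logic than about where it runs: performing the comparison on the device itself means the scanning machinery ships on every iPhone, whether or not its owner is suspected of anything.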
So why is Apple receiving so much criticism for what seems like a good idea?
Why backdoors are a problem
It is no accident that backdoors are so often first proposed as measures to combat the distribution of child sexual abuse material. Child abuse is an emotionally wrenching crime against the most vulnerable and innocent members of society. This is an important fight.
However, such a cause can also be used as a Trojan horse – an attractive and righteous banner serving as cover for something more sinister. Here’s why critics fear this might be a way for surveillance advocates to get their foot in the door:
- Encryption backdoors are a juicy prize: They would be a big win for any proponent of online surveillance. We live our lives online and in the company of our smartphones. By undermining the core technologies that guarantee online privacy, a government would be able to monitor practically everything that we do.
- Backdoors have broad applications: In theory, a backdoor like this should only ever be used for a single purpose (in this case, to catch child abusers). But in practice, it is very easy to expand that purpose to practically anything else. Once the backdoor has been built, the government can later decide (publicly or in secret) to use it for things you may not approve of.
But let’s say you completely trust Apple and the government. And you also support the fight against all forms of child abuse (as we all should). There are still problems.
What’s more secure, a door or a wall?
If you imagine encryption as a tunnel, a backdoor is the introduction of a door where there had previously been an impregnable (encrypted) wall. It introduces a vulnerability where one never existed and where, arguably, it shouldn’t.
From a long history of cybersecurity blunders (one of the largest and most recent being the SolarWinds hack), we know that government cybersecurity is far from watertight. How sure can you be that the government won’t ever lose the keys to that door in your tunnel if they’ve lost the keys to their own doors before?
There is also the question of precedent. If Apple normalizes encryption backdoors, the consequences will extend beyond Apple’s own products: it will be harder for opponents to fight back when surveillance advocates demand backdoors in Android phones or other devices.
Is Apple still dedicated to security and privacy?
We applaud Apple’s interest in assisting in the fight against child abuse. However, our fear is that a tool like this may do more damage than good. Apple is valued by many for their commitment to the security and privacy of their users (though they are far from perfect). Hopefully, they’ll continue to earn that reputation by reconsidering their interest in building backdoors into their products.