[Before we start, I want to acknowledge that people who know me know that I have no particular love for Apple in general, so my defense of their position here can hardly be chalked up to simple fanboyism.]
Apple has been in the news recently regarding their refusal to go to unusual lengths to provide access to the contents of an iPhone. One of the most interesting bits is that Apple is not saying that what they’re being asked to do is technically impossible. They are explicitly stating that they won’t do it, not that they can’t do it.
This is an interesting distinction, partly because it concedes a point that many have made for years: if Apple wanted to access any user’s device, they could; it’s only a matter of the effort required. Apple gets credit for having intentionally made it harder for them to access a user’s device, but so long as they control the hardware, the software, the update service, and the “cloud” storage, it seems unlikely they will choose to reach a point where they have no technical ability to access a user’s device [though there are now rumors circulating that they will; we’ll see]. Creating a device that they can’t access essentially requires that they give up the ability to push software updates automatically without resetting the phone to factory conditions [that is, a user would have to first unlock the device and interact with it before the update could happen].
But that’s more of a side note to the main event here. From my perspective there is really one main point, along with a corollary, that needs to be discussed:
1. How much effort should a company be required to expend to comply with law enforcement desires?
1.a. Should companies be required to design products to be accessible by law enforcement (i.e., mandatory backdoors)?
Note: Whatever answer you provide will be used by our government, and others, to demand the same cooperation in future cases.
And why would anyone use or buy backdoored products when actually secure tools are readily available (a backdoor is, by definition, a security vulnerability)?
Many people see Apple as being obstructionist in their stand. After all, we’re talking about a phone used by someone who thought shooting a bunch of people was a good idea. And Apple knows that. I imagine that Tim Cook is thinking a lot about this quote (generally attributed to H.L. Mencken):
The trouble with fighting for human freedom is that one spends most of one’s time defending scoundrels. For it is against scoundrels that oppressive laws are first aimed, and oppression must be stopped at the beginning if it is to be stopped at all.
Apple is very much relying on a slippery-slope argument. For a slippery-slope argument to be valid, you need to show a basis for the idea that if you provide A, then soon you’ll be asked for B, and then C, D, and E. Slippery-slope arguments are often made fallaciously; providing that support is what turns one into a valid argument.
I think Apple can make a convincing slippery-slope argument here because of the long history of law enforcement, and the executive branch of the government in general, demanding more and more information of lesser and lesser importance for almost no other reason than that the information is available. That pattern extends, in recent history, to illegal wiretapping that scooped up data on every American phone call for years, just because they could.
Apple can also show that other countries have demanded, and will continue to demand, that Apple provide access to devices for reasons Americans would find unacceptable (religion, sexual orientation, political views, etc.).
Apple’s argument is that the only way to guarantee civil liberties is not to make violating them illegal, but to make violating them impossible. It’s a very strong stand.
You, as an American citizen, have to think about this and decide where the line is. Too many people make knee-jerk decisions without spending any time considering the continuum of possibilities and outcomes.