Here at CT we’re not big on posting about topics just because they’re happening. (Unless it’s the 6 Nations, obviously.) But this Apple FBI back door saga is making me feel I should post something, not because it’s topical, not because I know a lot more about it than anyone who reads a decent newspaper / tech journal etc. (because I don’t), but because it’s becoming clear that this event is morphing into something of a turning point in how governments interact with tech firms in the US and, at more of a distance, the UK.
(For a comprehensive and thought-provoking piece on governments and tech intermediaries, read Emily Taylor’s recent piece, The Privatization of Human Rights: Illusions of Consent, Automation and Neutrality, for Chatham House.)
I’m going to assume you know most of the facts and the larger repercussions, and just jot down a few observations of my own, plus some I’ve come across in various digital rights back channels.
The order would require Apple (US) to create firmware to be loaded onto a specific phone to make it possible to do brute force password guessing. (Among other things, it would remove the limit on the number of incorrect passcode guesses before the device erases itself.)
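To see why removing those protections matters, here is a back-of-envelope sketch. The figures are assumptions for illustration only: a 4-digit passcode, and roughly 80 ms per attempt, which is the approximate cost of iOS's hardware-entangled key derivation as described in Apple's published security documentation.

```python
# Rough illustration of why stripping the guess limit and delays matters.
# ASSUMPTIONS: 4-digit passcode; ~80 ms per attempt (approximate figure
# from Apple's iOS security documentation for its key-derivation step).

ATTEMPT_COST_S = 0.08      # ~80 ms per passcode try (assumption)
COMBINATIONS = 10 ** 4     # every possible 4-digit passcode

# With no retry limit and no delays, the search is trivial:
worst_case_s = COMBINATIONS * ATTEMPT_COST_S
print(f"Worst case, no limits: {worst_case_s / 60:.0f} minutes")  # ~13 minutes

# Stock iOS escalates delays after repeated failures (up to an hour per
# attempt), on top of the optional wipe after 10 wrong guesses. A crude
# bound with hour-long delays shows why nobody brute-forces it as shipped:
delay_case_s = COMBINATIONS * 3600
print(f"With 1-hour delays: {delay_case_s / 86400 / 365:.1f} years")
```

The point of the demanded firmware is precisely to move a locked phone from the second regime into the first.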
The significant thing about this case is that the FBI, minus any enforcing legislation, has gone and found itself a judge to order a company to do something. Let’s take the second part of that first – ‘ordering a company to do something’ – as there’s something arguably new in the current FBI approach.
The old late 1990s crypto wars were about governments compelling companies to provide decryption keys to their encryption services. Thankfully, the governments lost. Otherwise we would never have had the encryption-based technologies that made e-commerce (remember when we called it that?) possible. The interception and surveillance debates of the 2000s have largely been about compelling companies to provide access to existing communications or data – albeit by forcing companies to re-engineer some business processes (e.g. billing, metadata retention) and technologies (‘black boxes’ for real-time intercepts). The current FBI order is an attempt to make a tech firm a) create entirely new firmware that is wholly designed to b) undermine the security of a core product, iOS.
(Kieran has a great post on Apple’s interest in maintaining its hardware manufacturing business by protecting the walled garden of locked-in services that depend on it.)
A lot of the debate hinges – or would hinge, if there were to be a legislative process that tried to weigh up the competing interests and goals – on what ‘reasonable technical assistance’ to law enforcement is. The Apple case is remarkable in that it couches reasonable assistance as basically breaking your own products. (Something telcos have done for years, but that’s another day’s work.) Apple has quite rightly made the point that not only does this break company security and therefore customer privacy, but that if they create an exploit for the FBI, the vulnerability will be used by the likes of Putin and various repressive regimes.
Less obvious is that the public fact of Apple having done this for US law enforcement would put its employees in other countries at risk. ‘You helped them, why don’t you help us; we know where your daughter goes to school’ kind of thing. (This isn’t hypothetical. I know someone this happened to. If it’s a choice between turning off a chunk of the Internet or facing down the rather fidgety men who have arrived in your kitchen in the middle of the night wielding semi-automatic weapons, well, it’s not really a choice, is it?) So when Apple says it has designed and encrypted its OS “so it’s not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8,” the audience for Apple’s stance isn’t wholly or perhaps even largely American.
This is why Apple has been publicly backed by other US-born but global tech firms Google, Twitter and Facebook. (And here I disagree somewhat with Kieran’s take – I think the support of other tech firms has been quite strong, though there’s also a slightly relieved flavour of ‘We’re right behind you….’ in it.) The FBI’s resorting to a court order is a rare enough instance of the USG suffering because of American dominance in Internet businesses. If the big tech firms were national champions and not much more, they would be far more amenable to the inter-personal backdoors that used to sort this kind of thing out without recourse to laws or court orders. But the very success of these companies means they have outgrown the bounds of a single state, even one as powerful as the USG. They simply cannot run a global business if they are seen to do too many special favours for one government.
Of course, this prompts the question of why a court order was necessary. Presumably because Apple had not cooperated informally and behind closed doors. Why did they not cooperate informally and secretly? Either a) they sincerely wished to force a public row and expose the unconstitutionality of the order, or b) it is simply an elaborate branding exercise that will be quietly abandoned once the point is made, a private accommodation has been reached with law enforcement, and the media has moved on to something else. Exhibit A: Google. State Department. China.
I think there is a little more good faith on Apple’s side than option b allows for, but principles that seem clear one moment get muddied as events move on.
In a world where what were individuals’ previously unremarked daily activities – walking around, taking public transport, buying things, talking to people – are now mediated through private firms, governments are as convinced that they must have this new information as they are frustrated by the fact that it is held by companies. The right to be let alone died long before the birth throes of the FitBit. Citizens are now just little motors chuntering around creating metadata exhaust trails. The current conflict is not an argument about our rights but rather a fight between governments and firms over how best to pin us down and hoover up the effluent. You can see why they might all be getting testy about who gets what.
Back to the earlier point about there being no legislation for the FBI to base its order on; this is really the whole point of the exercise from their point of view. They have not been able to get laws through the democratic process that will force companies to selectively break or weaken their products in this way. Hence the recourse to a court order based on a seemingly ancient writ that says if a judge orders something you must assume it is legal. This is troubling both from a political/democratic perspective and also a legal one.
Most of us in common law jurisdictions probably haven’t thought much about the different sorts of things courts can order versus what legislatures can impose, other than to fuzzily assume that what judges do must be based on specific laws. Of course, it’s actually more complicated than that. I am cheekily paraphrasing David Allen Green, here (who some of you know as @jackofkent), who says on a list I subscribe to that if you are the subject of a court order, you should typically still have recourse to some statutory scheme that allows you to challenge or comply with the order. Court orders – in the UK at least – will normally only make you hand over something you’re legally obliged to provide or stop you doing something you shouldn’t do. They really are not designed to force you to actively do something you are not already legally obliged to do, in some statute somewhere.
That is the general common law approach, and the US tech firms must be quite confident in the legal argument behind their claims that the Apple order is unconstitutional. But if the FBI can set a US legal precedent by rolling over one of the two most powerful tech firms in the country/world, then it will have less need of legislation because everyone else will just do as they are quietly asked. This, presumably, is why Apple is publicising and forcing the issue: to prevent a slippery slope of private enforcement actions by anyone with an eagle-adorned badge, to potentially force a Supreme Court case that will both clarify and constrain its obligations, and to protect its interests overseas.
A couple of final points.
This could NEVER happen in the UK. Why? Because the Investigatory Powers Bill both requires an overly broad base for ‘reasonable assistance’ and accompanies it with a gagging order. (Long experience has taught me there is always a good reason for what first appears to be sloppy drafting in a Home Office bill.) So, unlike in the US, there will be no chance here of a Vodafone executive publicly refusing to actively assist government hackers once this bill has passed. The old gag about the Snowden revelations plays out once again; it was oddly funny that wide-scale surveillance was able to happen in the US illegally, and in the UK almost wholly legally.
Finally, putting aside the various (and irrelevant in this case where the crime has been committed and the perpetrators are dead) 24-style ticking bomb arguments for the necessity of government coercion, there really is an important distinction between requiring a company or person to disclose something they already have/know, and enforcing their active assistance. Aside from Apple, encryption and the FBI, that may be the most important precedent governments are currently trying to set, whether by case law or dodgy statute.