Apple vs. The FBI

A court order can only do so much…

By now I’m sure most of you have heard about the debacle between the FBI and Apple.

Here’s a brief summary before I explain the ramifications:

The FBI has the San Bernardino shooter’s iPhone 5C, but they can’t get into it because it’s locked behind a passcode. The passcode is used to key device-wide encryption, so they can’t just remove the flash storage and read it directly; what they would read is indecipherable random noise.
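To make that concrete, here’s a minimal sketch of the general idea in Python (not Apple’s actual key hierarchy, which also mixes in a secret fused into the silicon): the passcode is stretched into an encryption key, and without that key the bytes sitting on the flash chips are indistinguishable from random noise.

```python
# Illustrative sketch only, not Apple's real design.
# Assumes the open source 'cryptography' package (pip install cryptography).
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

DEVICE_SALT = os.urandom(16)  # stands in for a per-device hardware secret

def key_from_passcode(passcode: str) -> bytes:
    """Stretch a short passcode into a 256-bit AES key."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=DEVICE_SALT, iterations=1_000_000)
    return kdf.derive(passcode.encode())

key = key_from_passcode("1234")
nonce = os.urandom(12)
stored_bytes = AESGCM(key).encrypt(nonce, b"contacts, photos, messages...", None)
# Reading 'stored_bytes' straight off the flash chips, without re-deriving the
# key from the correct passcode, yields only statistically random data.
```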

They also can’t brute force the passcode, because iPhones have a feature (which the user can enable) that wipes the phone clean after a certain number of wrong guesses. That feature may be enabled on this phone, so they can’t risk it.
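In rough terms, that optional auto-erase setting is just a failure counter the system consults before it will keep accepting guesses. A toy model of the behavior (my own illustration, not Apple’s code):

```python
# Toy model of the optional erase-after-too-many-failures setting.
MAX_FAILURES = 10  # iOS erases after 10 wrong passcodes when the option is on

class Device:
    def __init__(self, passcode: str):
        self._passcode = passcode  # stands in for the real wrapped encryption keys
        self.failed_attempts = 0
        self.wiped = False

    def try_passcode(self, guess: str) -> bool:
        if self.wiped:
            raise RuntimeError("keys destroyed; data is unrecoverable")
        if guess == self._passcode:
            self.failed_attempts = 0
            return True
        self.failed_attempts += 1
        if self.failed_attempts >= MAX_FAILURES:
            self._passcode = None  # destroying the keys is what "wipes" the data
            self.wiped = True
        return False
```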

What the FBI has asked Apple to do is create a deliberately compromised version of iOS that allows unlimited guesses, imposes no delay between guesses, and accepts guesses submitted electronically over the Lightning port (and possibly Bluetooth or WiFi). They then want Apple to sign this compromised OS image with Apple’s private software-distribution key (so that the phone accepts it) and boot the phone with it, so that a law enforcement agent can brute-force the passcode.
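To see why those three changes matter, here’s a toy brute-force loop under that assumption: with no wipe, no delay, and electronic submission, a 4-digit passcode falls in at most 10,000 guesses, which a connected computer could churn through very quickly.

```python
# Toy brute force of a 4-digit passcode, assuming the limits described above
# have been removed and guesses can be submitted programmatically.
import itertools

def brute_force(check) -> str:
    """check(guess) -> bool; tries every 4-digit combination in order."""
    for digits in itertools.product("0123456789", repeat=4):
        guess = "".join(digits)
        if check(guess):
            return guess
    raise RuntimeError("passcode is not 4 digits")

# Example: brute_force(lambda g: g == "7391") returns "7391" after ~7,400 tries.
```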

The legal issue: the FBI has a court order permitting them access to this specific iPhone and ordering Apple to provide that access as described above. Apple has the technical ability to fulfill the request but is refusing on principle. It’s somewhat akin to the FBI having permission to crack a bank vault while Apple refuses to let them in the front door.

Why?

Tim Cook’s open letter misrepresents a few of the technical issues, but his legal argument is very sound: Apple will probably challenge this all the way to the Supreme Court, and if they lose and comply with the order, the precedent will be set and the government will have the (potential) ability to order the same procedure for other phones.

Granted, that’s a slippery slope argument, but I’ve got another one for you: this newly acquired ability will quickly become pointless as terrorists begin using their own custom software to provide end-to-end encryption of communications. This type of software can be written by a single developer with a smidgen of crypto competency and decent computer science savvy. The necessary encryption algorithms are freely available in open source libraries.
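To give a sense of how low that bar is, here is roughly what end-to-end encryption looks like with PyNaCl, one of many open source bindings to the NaCl/libsodium library (my choice of library purely for illustration):

```python
# End-to-end encryption in a handful of lines using PyNaCl (pip install pynacl),
# an open source binding to the NaCl/libsodium crypto library.
from nacl.public import PrivateKey, Box

# Each party generates a keypair and shares only the public half.
sender_key = PrivateKey.generate()
receiver_key = PrivateKey.generate()

# Authenticated public-key encryption; a fresh random nonce is used per message.
ciphertext = Box(sender_key, receiver_key.public_key).encrypt(b"meet at midnight")

# Only the receiver's private key can recover the plaintext. Anyone in the
# middle (a carrier, a server, a seized backup) sees only ciphertext.
plaintext = Box(receiver_key, sender_key.public_key).decrypt(ciphertext)
assert plaintext == b"meet at midnight"
```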

This loops us back to Tim Cook’s original point, but not in the way he was thinking: once Apple designs a backdooring process, it will become increasingly useful only to nefarious people who want access to the phone’s normal features, the ones you and I use (banking, paying with Apple Pay, email, SMS messages, Facebook, etc.). Any decently sophisticated criminal or terrorist organization would have nothing to fear once word spread and workarounds were developed. If you’re the paranoid type, it would also make it easier for the government to spy on you.

There’s one more complication here: nothing is stopping Apple from building in hardware that makes it infeasible for them to override the passcode behavior in the future. Indeed, the only reason a software workaround is even viable now is that the 5C uses the older A6 chip. The A7 and later chips (used in all subsequent iPhone models) include a Secure Enclave that already enforces an escalating delay after each wrong guess.
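For reference, the policy those newer chips enforce is conceptually just a lookup from failure count to mandatory wait, executed outside the main OS, which is meant to put it beyond a replaced iOS image’s reach. A sketch of the idea (the thresholds below are approximate, not Apple’s authoritative values):

```python
# Sketch of an escalating-delay policy like the one enforced on A7-and-later
# devices; thresholds are approximate and shown only to illustrate the shape.
import time

DELAYS = {5: 60, 6: 5 * 60, 7: 15 * 60, 8: 15 * 60, 9: 60 * 60}  # attempts -> seconds

def enforce_delay(failed_attempts: int) -> None:
    if failed_attempts < 5:
        return
    delay = DELAYS.get(failed_attempts, 60 * 60)  # an hour per try beyond nine
    time.sleep(delay)  # on real hardware this wait happens outside the OS's control
```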

Bottom line: this is not something Apple can reasonably be asked to do going forward. Building in a backdoor from the factory is absolutely unacceptable, for the obvious reasons. I really don’t know what law enforcement can do about this; good, well-implemented encryption is impervious regardless of what resources are applied to cracking it, barring specific mathematical discoveries or dramatic increases in computing power.
