Locked iPhones and the FBI
Thinking
After the assassination attempt on former President Trump earlier this month, in which the shooter was himself killed by law enforcement, I was waiting for the FBI to complain that it couldn’t access the shooter’s locked iPhone—but that didn’t really happen. Some of the initial coverage mentioned in passing that the field office couldn’t unlock it, but with the expectation that the FBI’s national experts in Quantico could tackle it without much fuss. They did, within two days.
All of this played out in the context of decades of debate between proponents of strong consumer encryption and hawks within law enforcement and government who have sought to restrict the technology. In 2016, the specific conundrum of locked iPhones made national news when the FBI tried to force Apple to unlock the device of a suspected domestic terrorist in San Bernardino. Apple refused, arguing not only that it could not unlock the phone but that cooperating would undermine the security of all of its customers. This remains the general outlook of most of the field of computer security, in both industry and academia: adding methods for “exceptional access” to encrypted information is inherently insecure. The FBI ended up dropping the case against Apple after spending a hefty sum on a third-party phone-cracking tool to access the device.
Given all of that, it was a bit surprising how little of a news item the shooter’s iPhone became. Some (if not most) of that can be explained by the news cycle immediately pivoting to Trump’s triumphant RNC and then to Kamala Harris’s triumphant ascendance to the top of the Democratic ticket.
But there are two other big factors at play here, both of which law enforcement boosters have generally preferred not to discuss. The first is that the much bigger security angle of the assassination attempt was (obviously) the Secret Service’s abysmal lapse in securing the perimeter and the failure of local law enforcement to confront the shooter in time.
Civil liberties advocates have repeatedly pointed out that mandating encryption back doors does not magically solve the problem of competent law enforcement or criminal investigations. This case, where the encryption became a problem only after the fact, casts a bright light on that disconnect. The Secret Service director has since resigned in disgrace, with the agency taking much of the blame for the lapse.
The other factor is the speed and availability of the tools that agencies like the FBI can use to crack open locked devices. After the public legal tussle over the San Bernardino shooter’s iPhone, it became clear that Apple would not willingly cooperate with law enforcement, and that third-party tools could do the job in exchange for large sums of cash. The ensuing years seem to have been long enough for those tools to spread widely: it is now merely a question of fast-shipping an iPhone to a secure lab that has already purchased the tool, rather than jumping through legal hoops to justify a huge one-time cost.
Computer security can never be guaranteed. Cryptology has always been a give-and-take between codemaking and codebreaking, much like their metaphorical equivalents, locksmithing and lock-picking. This is why panicky claims about law enforcement “going dark” are so disingenuous. Today’s encryption schemes would have completely bamboozled the crack codebreakers at Bletchley Park, and most cryptogram hobbyists can break pen-and-paper ciphers from the Renaissance. But technology and society both keep moving, bringing defenders and attackers along with them and constantly reshaping nearly everything about the adversarial security landscape. What seems like “going dark” one year looks like a simple two-day delay less than a decade later.
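To make the point about old ciphers concrete: here is a minimal sketch (my own illustration, not anything from the tools or cases discussed above) that brute-forces a Caesar shift cipher, an even older and simpler scheme than the Renaissance designs, by trying all 26 keys and scoring each candidate against approximate English letter frequencies. The message, function names, and frequency table are all illustrative assumptions.

```python
from string import ascii_lowercase

# Approximate relative frequencies of letters in English text (illustrative values).
ENGLISH_FREQ = {
    'a': 8.2, 'b': 1.5, 'c': 2.8, 'd': 4.3, 'e': 12.7, 'f': 2.2, 'g': 2.0,
    'h': 6.1, 'i': 7.0, 'j': 0.15, 'k': 0.77, 'l': 4.0, 'm': 2.4, 'n': 6.7,
    'o': 7.5, 'p': 1.9, 'q': 0.095, 'r': 6.0, 's': 6.3, 't': 9.1, 'u': 2.8,
    'v': 0.98, 'w': 2.4, 'x': 0.15, 'y': 2.0, 'z': 0.074,
}

def shift(text: str, key: int) -> str:
    """Apply a Caesar shift of `key` positions to the letters in `text`."""
    return ''.join(
        ascii_lowercase[(ascii_lowercase.index(c) + key) % 26] if c in ascii_lowercase else c
        for c in text.lower()
    )

def english_score(text: str) -> float:
    """Higher means more English-like: sum the frequency weight of each letter."""
    return sum(ENGLISH_FREQ.get(c, 0.0) for c in text)

def crack_caesar(ciphertext: str) -> tuple[int, str]:
    """Try all 26 shifts and return the (key, plaintext) that looks most like English."""
    candidates = [(key, shift(ciphertext, -key)) for key in range(26)]
    return max(candidates, key=lambda kv: english_score(kv[1]))

if __name__ == "__main__":
    secret = shift("meet me at the old bridge at midnight and bring the papers", key=7)
    key, plaintext = crack_caesar(secret)
    print(f"recovered key={key}: {plaintext}")
```

A handful of lines and a letter-frequency table are enough to recover the key instantly; that gap between what once took a state’s worth of effort and what now takes a hobbyist an afternoon is the give-and-take the paragraph above describes.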
As an American academic advised the NSA in 1981 while opposing the idea that the agency should impose limits on encryption research: "Stay ahead of others."1
Reading
If you, like me, have neither the time nor energy to keep up with the latest hijinks from the crypto bros & friends, Molly White’s Citation Needed exhaustively covers it for you! In particular, look for the Craig Wright update, if you, like me, enjoy the occasional foray into the question of who is Bitcoin inventor Satoshi Nakamoto? (answer: not Craig Wright)2
Molly White also just released a project tracking the cryptocurrency industry’s 2024 US election spending, which is large despite crypto not actually being much of a median voter issue.
1 Davida, George I. "The Case Against Restraints on Non-Governmental Research in Cryptography." Cryptologia 5, no. 3 (1981): 143–48.
2 Explaining this reference would take an entire newsletter or three. You can definitely Google it, but maybe I’ll write an issue on this later because it is a really delightful ongoing saga.