Apple’s CEO issued a pointed statement rejecting, on privacy grounds, a court order for what he considers an iPhone backdoor.
In a dramatic statement, Apple CEO Tim Cook called it “an unprecedented step which threatens the security of our customers.”
He was writing about a court order demanding that Apple assist the FBI in hacking into the phone belonging to San Bernardino gunman Syed Rizwan Farook as the agency investigates the December 2015 shooting that killed 14 people before police fatally shot Farook and his wife, Tashfeen Malik.
Cook’s position is that the request carries implications that reach far beyond this single case.
The White House and the FBI say the request concerns only this single phone.
Since September 2014, Apple’s iPhone software has offered a data-protection option that, once enabled by a user, automatically deletes all data on the phone after 10 unsuccessful passcode attempts. The feature was reportedly added after Edward Snowden’s disclosures about government surveillance.
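For illustration, here is a minimal sketch of how such a wipe-after-ten-failures policy works in principle. This is a toy model, not Apple’s actual implementation; the names (`Device`, `MAX_ATTEMPTS`, `_wipe_user_data`) are hypothetical.

```python
# Toy model of an auto-erase policy: after MAX_ATTEMPTS consecutive
# failed passcode entries, the user data is destroyed. Illustrative
# sketch only, not Apple's actual implementation.

MAX_ATTEMPTS = 10  # threshold described in news reports

class Device:
    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failed_attempts = 0
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False
        if guess == self._passcode:
            self._failed_attempts = 0  # a success resets the counter
            return True
        self._failed_attempts += 1
        if self._failed_attempts >= MAX_ATTEMPTS:
            self._wipe_user_data()
        return False

    def _wipe_user_data(self):
        # On a real device this would destroy the encryption keys,
        # rendering the stored data unrecoverable.
        self.wiped = True
```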
The BBC reported that the government wants two things from Apple: the data-erasure option deactivated, and a way to submit passcode guesses quickly enough that investigators can try every possible combination.
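Some back-of-the-envelope arithmetic shows why those two changes matter together. A four-digit numeric passcode has only 10,000 possibilities, so with the erase option off and guesses fed in electronically rather than typed by hand, a brute-force search finishes quickly. The attempt rate below is an illustrative assumption, not a measured figure:

```python
# Rough brute-force arithmetic for a numeric passcode.
# The attempt rate is an illustrative assumption, not a measured value.

combinations = 10 ** 4          # a 4-digit passcode: 0000 through 9999
attempts_per_second = 12.5      # assumed rate for electronically submitted guesses

worst_case_seconds = combinations / attempts_per_second
print(f"Worst case: {worst_case_seconds / 60:.0f} minutes")  # ~13 minutes

# With the auto-erase feature enabled, the search ends after 10 wrong
# guesses, which is why the FBI asked for that feature to be disabled first.
```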
In his statement, Cook framed the company’s position this way:
We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
I am immediately reminded of the familiar old argument that if you have nothing to hide, you have no reason to object to a loss of privacy.
In the old days, that was probably closer to the truth.
“Nothing to hide” meant just that: no “dirty secrets” on your devices, no “adult” sites in your browser history, no personal photos that crossed any line.
But these days, it’s other things on your phone, such as passwords, financial data, and personal information, that could cause serious problems, because they make anyone a potential victim of identity theft.
On one hand, unfurling the “slippery slope” argument against creating any method of accessing data can sound like just another conspiracy theory.
On the other, what about a person arrested on charges of planning an attack that has not yet happened? What if investigators determine that critical information that would stop the attack might be on that person’s phone?
Yes, it’s always dangerous, in a way, to begin playing “What if?”
But that’s exactly the game our country’s intelligence officials must play in a world of terrorism that didn’t exist in this form a generation ago.
What’s the answer here? I suspect it comes down to a simple question: which is more important, your privacy or your safety?
We shouldn’t have to give up either.
But if one or the other might someday be at genuine risk, which would you rather give up?