Tim Cook has said he will challenge a judge’s order to help the FBI unlock an iPhone belonging to Syed Rizwan Farook, who with his wife, Tashfeen Malik, shot dead 14 people in San Bernardino, California, in December. Mr Cook, Apple’s boss, said the ruling had “implications far beyond the legal case at hand”. The government, he said, “is asking Apple to hack our own users.”
This is post-Snowden corporate zeal for customer privacy. But is it misplaced in this particular instance, when a crime has been committed?
The US Justice Department has sought Apple’s help. On Tuesday, a judge ordered Apple to disable the feature that wipes the data on the phone after 10 incorrect tries at entering a password. That’s all. The judge did not ask Apple to break the encryption.
But Apple says it cannot unlock newer iPhones even when officers obtain a warrant, because it does not hold the decryption key. Only someone who knows the password can unlock the device, and 10 incorrect tries will automatically erase all the data on the phone.
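The mechanism described above can be sketched roughly as follows. This is a minimal illustration only: the names (`PasscodeGuard`, `try_unlock`) are hypothetical, and the real logic is enforced in Apple’s device firmware, not in application code.

```python
# Illustrative sketch of a retry-limited unlock with auto-wipe.
# Not Apple's implementation; names and structure are hypothetical.

MAX_ATTEMPTS = 10  # the limit at issue in the court order

class PasscodeGuard:
    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failures = 0
        self.wiped = False  # once True, the data is gone for good

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False  # nothing left to unlock
        if guess == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self.wiped = True  # auto-erase after the 10th wrong try
        return False
```

The point of the dispute is the wipe step: the FBI wants software that skips it, so that passcodes could be guessed without limit.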
Apple is being asked to write software that bypasses the feature and that, the government says, would be used only on Syed Rizwan Farook’s phone.
But Mr Cook describes such compliance as dangerous and “the equivalent of a master key, capable of opening hundreds of millions of locks.”
The answer, he seemed to suggest, is not to create it at all.
How about a fairytale solution instead? Create the “magic key”, then lock it away in a dungeon guarded by fierce fire-eating dragons. Only the prince can have it, and then only if he answers three questions correctly. Naturally, it is the questions that will be crucial. They must ensure that whoever wants the key is legitimate, has a legitimate purpose, and knows that the key will disintegrate if it is not returned to the dungeon within a specified timeframe.