Apple vs. the FBI
Normally at BankVault, we try to keep our blogging and news limited to issues related to bank account hacking and endpoint security. But there's a fight currently underway in the United States between Apple and the US government that bears mentioning because, to paraphrase an old American saying about the power of General Motors, 'as Apple goes, so goes the country.' And as Apple goes, so goes the mobile world.
The issue at stake is whether the US government can force Apple to change the iPhone OS so that it provides the US with a backdoor key to iPhone encryption. The issue lands, as is so often the case, as a choice Americans must make between privacy and the promise of possibly stronger anti-terrorism security. Since the San Bernardino shootings, US federal law enforcement agencies have been petitioning Apple for *some* way to crack into the locked iPhone 5c of one of the shooters, Syed Farook. The logic behind the requests is that Farook changed his iCloud password in the month or so preceding the attack, which suggests he didn't want the FBI to know who he was communicating with and which websites he was visiting in the days leading up to the shootings.
Apple has worked with the FBI in the ensuing weeks and months, but the FBI has asked for more: it wants Apple to change the iPhone operating system to allow brute-force attacks on iPhone encryption. 'Brute force' means cracking the encryption by using supercomputing networks to try every possible passcode combination. (Right now the iPhone will erase itself after 10 failed passcode attempts.)
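To see why that 10-attempt limit matters, here's a minimal back-of-the-envelope sketch (purely illustrative, not Apple's actual implementation): without a lockout, a 4-digit numeric passcode has only 10,000 possibilities, but with a 10-attempt erase policy an attacker can cover only a vanishing fraction of the space before the device wipes itself.

```python
# Illustrative sketch: why a 10-attempt erase policy defeats
# brute-force passcode guessing. This models only the arithmetic of
# the search space, not any real iPhone internals.

def passcode_space(digits: int) -> int:
    """Total number of possible numeric passcodes of a given length."""
    return 10 ** digits

def fraction_searchable(digits: int, attempt_limit: int) -> float:
    """Fraction of the passcode space an attacker can try before the
    device erases itself."""
    return attempt_limit / passcode_space(digits)

if __name__ == "__main__":
    for digits in (4, 6):
        space = passcode_space(digits)
        frac = fraction_searchable(digits, attempt_limit=10)
        print(f"{digits}-digit passcodes: {space:,} combinations; "
              f"10 tries cover {frac:.4%} of the space")
```

With the limit in place, 10 guesses against a 4-digit passcode cover just 0.1% of the possibilities; against a 6-digit passcode, 0.001%. Remove the limit, and exhausting either space by machine becomes trivial, which is precisely the change the FBI requested.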
“No way,” Apple CEO Tim Cook responded in a passionate open letter.
Ironically for US citizens, we’ve been here before. In 1996, the US government tried to do this very thing to ALL personal computers in the US using a very popular (and still quite good) encryption program by Phil Zimmermann (a legend) called ‘Pretty Good Privacy’, better known as PGP. Even then, at the dawn of the internet age, enough people knew enough to recognize that this request was a very bad idea.
Whatever the government’s intentions, creating that kind of backdoor into iPhones is deeply troubling. Backdoors are, by their very nature, non-discriminating: once the backdoor exists, ANYONE can knock.
Imagine what governments in countries even more lax about citizen privacy (ummm…China?) would do with it.
And then there’s the for-profit criminal hacking community to consider.
In 1996, opponents like the Electronic Frontier Foundation (EFF) argued against the US government’s attempt to gain a backdoor to PGP. ‘Privacy is like Pandora’s box,’ went the argument. ‘Once a government *has* a level of access, it’s never long before they use it.’
Twenty years later, the EFF is again fighting the US government on this issue. Their argument has a depth of detail and passion that we won’t attempt to replicate here; read their blog post on the Apple encryption issue for yourself.
In the end, each individual, even if they’re not a US citizen, has to decide where they stand: lose the information that might have been gleaned from Syed Farook’s iPhone, and the help it might have given in identifying self-radicalized terrorists going forward? Or risk making one of the world’s most popular devices the world’s most easily broken hacking target?
Everyone has to make that decision for themselves. If you need an additional example of how well governments work with technology, read this story from CNET, “Apple says investigators ruined best way to access terrorist data”: a backup feature might have provided the FBI with a way to access data from the iPhone of a San Bernardino terrorist, but a change to the Apple iCloud password foiled that idea.