Apple Stance has Public Divided on Issue of Privacy Versus Security

The issue of privacy versus security is once again dividing the public thanks to the decision by Apple to oppose a judge’s order to unlock the iPhone 5c of Syed Rizwan Farook, one of the two people implicated in the San Bernardino shooting last year.

For the public, the issue revolves around trust in the government, the fear of terrorism, and the need to maintain at least some level of privacy. For Apple, privacy and security are a major selling point, something the company probably fears will be lost if every cellphone maker willingly provides the government a “backdoor” to their devices.

In many countries, the idea that a citizen has a right to privacy is generally accepted. What may surprise some, however, is that this right is not explicitly enshrined in the US Constitution; only elements of it are, such as the Fourth Amendment of the Bill of Rights:

The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.

As the government secured a warrant to search Farook’s iPhone (which was the property of his workplace, in any case), the issue really isn’t about whether the government has the right to access the data on the phone – that is established; it does. The issue is whether Apple must build a so-called backdoor for the government.

This is an immensely complicated issue for most of the public, many of whom have formed an opinion based simply on privacy versus security.

But first, let’s look at what the judge is actually ordering Apple to do: develop and load a new version of the iPhone’s operating system into this particular iPhone, one that would allow authorities to use its computers to quickly guess Farook’s passcode without being locked out before reaching the allowable number of wrong guesses.

This seems reasonable… until one realizes that once it is known that Apple has this software and can load it into any iPhone, then authorities can ask for it to be installed again.
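To see why that retry limit is the whole crux of the order, here is a toy model of the mechanism. All names and numbers below are illustrative (iOS is widely reported to offer an erase-after-10-failed-attempts setting, but this is not Apple’s actual code), and the passcode space is limited to four digits for the sketch:

```python
# Toy model of a passcode retry limit. Purely illustrative --
# not Apple's implementation; names and limits are assumptions.

ATTEMPT_LIMIT = 10  # device erases itself after this many wrong guesses


def try_unlock(guess: str, secret: str, state: dict, limit_enabled: bool) -> bool:
    """Simulate one passcode attempt against a locked device."""
    if state.get("wiped"):
        return False  # data already erased; no attempt can succeed
    if guess == secret:
        return True
    state["failures"] = state.get("failures", 0) + 1
    if limit_enabled and state["failures"] >= ATTEMPT_LIMIT:
        state["wiped"] = True  # auto-erase: the data is gone for good
    return False


def brute_force(secret: str, limit_enabled: bool):
    """Try every 4-digit passcode in order; return the code found, or None."""
    state = {}
    for code in (f"{i:04d}" for i in range(10000)):
        if try_unlock(code, secret, state, limit_enabled):
            return code
        if state.get("wiped"):
            return None  # lockout triggered before the code was found
    return None
```

With the limit in place, `brute_force("7391", limit_enabled=True)` gives up after ten guesses and returns `None`; with it disabled (the effect the order asks Apple’s modified software to achieve), the same call walks all 10,000 combinations and recovers the code.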

“This is a new situation in new circumstances,” said Christopher Budd, global threat communications manager, Trend Micro. “Both parties have good and valid arguments. It’s the ongoing tension between individual privacy and collective security.”

One could reasonably say that many factors would have to be lined up before any new software solution could ever be used: the iPhone would be in the possession of the authorities, and a warrant would be in hand, for instance. If the subject of the investigation were still alive, they would, one assumes, be able to challenge the use of the software in court.

But does the government have the right to order a company to develop a piece of software? Apple says “no,” but if this case proceeds, it may find that the government has picked the perfect case to test this. Farook is dead, and the owner of the iPhone is cooperating and has no objections to the phone being unlocked. Further, many citizens are unsympathetic to a company not cooperating in an effort to halt terrorism, no matter what privacy issues may be involved. After all, a large percentage of the US population supports a candidate for President who advocates that the government needs to increase its use of torture.

And as lawyers have pointed out on Twitter and elsewhere, this case also has First Amendment implications, in that it would require Apple to create and publish code to accomplish the task set in the judge’s order.

Maybe Apple knows it will lose this, and is willing to have the case go all the way up to the Supreme Court in order to show customers it is fighting for their privacy. This case is a good one for them, too. After all, there is no immediate threat (though some might argue that any time wasted in discovering Farook’s contacts might be seen as creating a threat).

Apple now has five days to respond, then another ruling will likely force it to comply. Apple will then appeal to the next level.

In comment threads on major newspaper websites, readers are arguing both sides of this issue. As far as I can tell, opinion is pretty evenly split: many simply do not trust the government and want their devices secure, no matter the cost; others have adopted what might be called the “if you’ve not done anything wrong you have nothing to worry about” position. Still others use the “hidden nuclear bomb” argument to defend the government’s position.

reposted with permission from TNM 

image by mattbuchanan

D. B. Hebbard

D. B. Hebbard is a 30+ year veteran of the newspaper and magazine publishing business, and has been publisher of the digital publishing website Talking New Media since 2010.


  1. fahirsch, 18 February 2016

    There is no security without privacy.
    And the biggest danger to Liberty is Governments. True in 1789, even more in 2016.

  2. Jason van Gumster, 18 February 2016

    This isn’t just a government trust matter. It’s a practical matter. If you purposefully introduce a vulnerability, then that vulnerability is available to anyone… regardless of whomever that “feature” was actually developed for. There’s even recent evidence of this in the discovery of backdoors in routers and firewalls from Cisco, Juniper, and Fortinet. And on network-enabled devices (including phones), exploited vulnerabilities in individual devices can serve as an attack vector on more protected hardware/infrastructure by way of botnets.

    So the framing of the argument here is wrong. It’s not privacy vs. security. It’s all about security. Anyone advocating a backdoor of any sort has a much more short-sighted view of general security and possibly a lack of understanding as it pertains to computer security.

  3. jjj, 18 February 2016

    Lol they don’t get it at all.
    It’s not security vs privacy. It’s security of all iPhone users vs the FBI’s desire to acquire some info. If anything, a backdoor is a big national security risk and the US Gov should be smarter than to want one.
    This is not about the gov, it’s about Apple claiming that it would make the iPhone less secure, in general. Problem is, it’s not true. The backdoor is already there; they built it, and they are only being asked to exploit it. The device is already not secure, the existence of the backdoor makes it that; exploiting the backdoor doesn’t really change the level of security offered.
    Plus, the mandatory kill switch is already a gov-mandated backdoor. A different kind of backdoor, and one that some better-informed users can disable, but a gov-mandated backdoor nonetheless.
    Apple will lose, and if they really want a secure device they need to stop sacrificing security for convenience. They shouldn’t be able to modify the software without the user’s consent, they shouldn’t ship the kill switch feature enabled by default, and they should drop any biometrics. Otherwise, reasonable security can’t be achieved.

