Why Tim Cook is wrong: A privacy advocate's view

Apple has recently released an open letter explaining why it will challenge a judicial order requiring it to hack the iPhone of one of the accused San Bernardino terrorists. As someone who believes in individual civil liberties and personal privacy above nearly all other considerations, my first instinct is to applaud Apple. …

  1. Anonymous Coward
    Anonymous Coward

    1. All back-door schemes reduce to: Send a plaintext copy of all communications to the government.

    2. All crypto schemes should assume the hash will be stolen for offline cracking by a state-level actor. Key-stretch accordingly.

    3. Don't buy American is already a good principle.
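The key-stretching point in (2) can be made concrete with a minimal sketch, assuming PBKDF2 as the stretching function; the password and iteration count here are illustrative placeholders, not recommendations:

```python
import hashlib
import os

def stretch_password(password: bytes, salt: bytes, iterations: int = 600_000) -> bytes:
    # PBKDF2-HMAC-SHA256: an attacker must repeat the same iteration count
    # for every offline guess, so raising it directly raises cracking cost.
    return hashlib.pbkdf2_hmac("sha256", password, salt, iterations)

salt = os.urandom(16)
key = stretch_password(b"hunter2", salt)
print(len(key))  # 32-byte derived key
```

Storing only the stretched value plus the salt means a stolen credential database still forces a state-level attacker through the full iteration count per guess.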

    1. MyffyW Silver badge

      The article seems to suggest that given the common good is served by knowing about this flaw and we can therefore agitate for a patch to it, the boys in blue should be allowed to use it, just this once, because the owner is suspected of being particularly unpleasant.

      I still feel my size 7s losing traction on that slippery slope.

    2. asdf

      except

      >Don't buy American is already a good principle.

      Except that iPhone security is front and center as being pretty solid, what with the Feds throwing a fit. Do you really think buying a Xiaomi Android because it's not American will make you safer? (Hint: Android full-phone encryption tends to be garbage, which is why Google backed off requiring it.) Yes, the US government sucks, but they are incompetent enough to let the world know what they are up to, unlike many other governments.

      1. Anonymous Coward
        Anonymous Coward

        Re: except

        How often do you travel to China, how often do you travel to the USA?

        How often do you compete with US owned companies vs Chinese companies?

        I don't care if the People's Patriotic Army have a black box they can connect my Xiaomi phone to at a security checkpoint, because I don't go to China and China doesn't make jet engines.

        Having a black box that the TSA can routinely run all arriving foreigners' phones through - that is an issue to me, and to my employer.

        1. Anonymous Coward
          Anonymous Coward

          Re: except

          and more than likely the three-letter government agencies will be able to get at what's on your Xiaomi phone much more easily than an iPhone, regardless of what country you visit. You notice the FBI hardly singles out Google at all when ranting (if nothing else because a very large percentage of Android phones are never encrypted).

        2. midcapwarrior

          Re: except

          Except they actually do make jet engines

          1. asdf

            Re: except

            with parts probably from 25 other countries. But keep believing not buying American or only buying from UK companies makes a difference in this era of multinationals picking where they pay taxes and suing governments for harming their shareholders.

            1. DrBobMatthews

              Re: except

              Oh please! So you think that Boeing doesn't use parts sourced from companies other than US companies! I am not a medical doctor, but I have the feeling that your severe case of myopia, ignorance and stupidity rules you out of making a valid comment.

        3. cosmogoblin
          Black Helicopters

          Re: except

          Is it paranoid to wonder, when it takes my phone/laptop an extra 30 seconds to get through security compared with everybody else's, if they have installed keylogging hardware?

        4. Graham Triggs

          Re: except

          Do you regularly send emails that would give the TSA a reason to consider you a risk to board a plane?

          I have a really hard time with arguments about how bad it is for the FBI / TSA / etc. to have access to your personal details, because of what they might do - no law abiding citizen should have anything to fear from that.

          However, that doesn't mean that the FBI / TSA / etc. should have access to this information, because

          a) Any backdoor that allows the authorities in is immediately a vulnerability that could be exploited by *anyone*

          b) Can you be certain that the authorities systems won't be breached?

          It is *impossible* to protect the human rights of law abiding citizens (you know, the things like keeping them safe from being murdered), if you don't obtain information that "the bad guys" thought they were keeping private (whether that's because of their own deficiencies, or a deliberate breach).

          But proportionality and accountability are important. What the FBI wants is not a consequence-free solution; it creates its own problems, and we have to be honest about that.

          1. 6th

            Re: except

            Whatever. With online backups - just make sure you have factory-reset your devices before handing them over to the authorities, walk out of the airport etc., and then reconnect to your profile.

            You won't get me to log in to my online account, even over hot coals. If it's insisted you log in to your account - have an empty or dummy account ready, just for them.

            If the agencies are going to treat adults with such contempt to make us out like children - I see no reason why we cannot play the game.

            They started it...

            1. Paul Smith

              Re: except

              Online backups on US owned, US based or US controlled servers are just saving TSA the hassle of hacking your device.

          2. Captain DaFt

            Re: except

            "no law abiding citizen should have anything to fear from that."

            That's already taken care of. There are no law-abiding citizens in the US.

          3. Paul Smith

            Re: except

            I had a quick glance at the US constitution and I couldn't see anything about being protected from murder. Nor did I see anything that gives potential or unnamed victims supremacy over my rights. In fact, the Fourth Amendment is pretty explicit about what must be done before my rights can be violated, and stopping a murder (which this case is not about) is not on the list.

      2. KeithR

        Re: except

        "Yes the US government sucks but they are incompetent enough to let the world know what they are up to unlike many other governments"

        You don't seriously believe that, do you?

      3. Anonymous Coward
        Anonymous Coward

        Re: Xiaomi android because its not American will make you safer

        No. Android is developed by an American company, and without Play Services is supposedly pretty limited. (According to BlackBerry - hence flushing their own system, which runs non-Play-Services Android apps rather well, down the privy.)

        BB10 would make you safer, but it looks like it will be going the same way as Symbian (also not American.)

    3. ma1010
      Unhappy

      So "Don't Buy American"?

      Yes, the US government, flogged on by assholes like Diane Feinstein (Senator representing the NSA and various other elitist bastards), certainly wants nobody (except her very important self and a few cronies, of course) to have any privacy - or any other rights of any kind, for that matter.

      It's a really good thing that, on the other side of the pond, there's nothing like a Snooper's Charter or any organization like GCHQ that snoops on everyone's traffic. Brits are so lucky to be free of that sort of thing! Not to mention other countries, such as France, where the government would never snoop on their own citizens. Well, not any more than China or Saudi Arabia does, anyway.

      I do strongly agree with the author of this article, but I can't imagine anything like that ever happening. If Apple (or anyone else) came out with really unbreakable encryption, they'd just outlaw it and throw anyone who used it into the Gulag. And that goes on either side of the pond, and in either hemisphere, too. Unfortunately.

      1. DrBobMatthews

        Re: So "Don't Buy American"?

        I trust none of the so-called security agencies, least of all the NSA, who seem to think that destabilising an elected foreign government by the use of paid thugs to create riots, all financed by the State Department, is legitimate and legal. This action was of course supported by the whimpering jellyfish of a UK government, who have to ask US permission to wipe their own backsides.

        No longer do the US and UK governments represent freedom; they are both quick to condemn other states for their shortcomings, but con their own populations by selling them the lie of security.

        Without this big con, the US and UK arms makers and dealers would go bust.

        Why is the US State Department so worried about a single guy holed up in the Ecuadorian Embassy in London? Are they terrified he might just release even more truths about the US, UK, NATO and the dodgy nuclear deals with Israel? Or are they afraid that the truth might be revealed regarding which country's missile brought down a civilian airliner? History tells us that the US is very adept at downing civilian airliners and decorating those responsible. Of the major powers, the US and the UK are not to be trusted; neither, for that matter, are the USSR, Israel or Saudi Arabia. They are all governed by megalomaniacs.

  2. Alister

    Either way, it would appear to your correspondent that Apple screwed up when designing this device and it left open a means of attack. The judge is asking Apple to use its expertise to exploit this flaw. It's as simple as that.

    As far as I am concerned, the judge is in the right here. Apple is not being ordered to create a flaw and distribute it to all devices. It is not being prevented from fixing this flaw in future devices. It is being asked to exploit a flaw that currently exists, and for the privacy-conscious this is actually a good thing.

    I'm sorry Trevor, but I think you may be extrapolating too far here.

    The judge has asked Apple to assist as you say, by disabling the 10-strikes-and-you're-out mechanism on the PIN, but I don't see that this means there is necessarily a flaw which allows that, it may just be wishful thinking on the part of the judge, much as the requests to provide "encryption which is breakable only by governments" which have sprung from clueless officials on both sides of the Atlantic.

    1. Dan 55 Silver badge

      Search "How Apple's encryption works" in this text...

      http://blog.cryptographyengineering.com/2014/10/why-cant-apple-decrypt-your-iphone.html

      The secure enclave enforces a five-second delay between failed PIN attempts, it can't return the encryption keys, and they'd have to flash new firmware onto the running device in the first place, without the device's co-operation.

      1. Anonymous Coward
        Anonymous Coward

        How does the secure enclave time the five seconds? What if you make those five "seconds" last five ms or even less?

        1. Dan 55 Silver badge

          By overclocking the device by several orders of magnitude? It'd crash the CPU I expect.

          1. Danny 14

            aye but

            All code can be altered or bypassed, and Apple might know how. Or they might know how to clone an identical phone, or clone it to a VM; 10,000 combinations isn't a lot, so 1,000 VMs would be needed for 10 tries each.

            The FBI has gone to a court to get Apple to help it with one device. There is nothing about the FBI wanting a permanent method for all iPhones.

            1. Doctor Syntax Silver badge

              Re: aye but

              "The FBI has gone to a court to get Apple to help it with one device. There is nothing about the FBI wanting a permanent method for all iPhones."

              There's a good deal of wanting to use an unusual set of circumstances to set a precedent for lots of other circumstances.

            2. Danny 2

              Re: aye but

              The correct phrase is 'Aye, right'. Nobody says "Aye, but". Not once, not ever. Are you trying to discredit me and the presumably other 12 or so Dannys here? What a daft post. Utter mince.

            3. David Walker

              Re: aye but

              It's all just speculation, so... If it was simple to exploit, the FBI would do it and not need Apple. Tim's email seems to imply that there needs to be hardware produced in combination with software, and that it could be used on other devices. Probably a firmware or hardware certificate only known to Apple is needed, and this combination might apply to a range of production units. Even a demonstration in principle could create a huge legal battle for Apple.

              I'm sure the army of "they should do it for God and country and to get those nasty terrorists" includes a few ambulance chasers who would immediately class-action sue the company for falsely claiming it can't access data, when the FBI will very publicly announce that it did. You see, in America suing is a national pastime. Hell, you can sue someone for saving your life using CPR if the compression technique that was used leaves a boo-boo. A lobby group funded by trial lawyers helped defeat laws to reduce patent troll cases - how dare someone ruin that fun. A woman sued 'cause her hot coffee was too hot - and won.

              So this case is just one of a string of cases that will provide entertainment and lobby-group $$ for years to come. Oh, and if it does get pushed up to the Supreme Court - good luck, politics will keep them out of commission for at least 1.5 years. Well, time to make some popcorn and turn on C-SPAN; it's going to be a long, long night.

      2. calmeilles

        It seems that the phone in question is a 5c and doesn't have the secure enclave.

        https://blog.trailofbits.com/2016/02/17/apple-can-comply-with-the-fbi-court-order/

      3. Pseu Donyme

        > http://blog.cryptographyengineering.com/2014/10/why-cant-apple-decrypt-your-iphone.html

        Thanks for the link! :)

        The article explains why Apple can in fact decrypt an iPhone given physical access. A simplified version seems to be that the AES keys are ultimately derived from the passcode and a 256-bit unique per-chip key (UID) baked into the core SoC (A6 in the case of a 5c) at the time of manufacture. The UID is not accessible to software as such, but can be fed to the AES hardware via an internal hardware path. This means that firmware could brute-force the derived key(s) by going through the passcodes using the AES hardware and seeing what decrypts. For simple/short codes at least this is quite feasible: using the 80 ms per iteration from the above article, a 4-digit code space would be completely covered in 800 s.

        The 5c / A6 does not have the 'secure enclave' so this is not a consideration for the case at hand, but since the code running there is also a part of the firmware provided by Apple this wouldn't seem to make a difference for the newer models (from 5s / A7 onward) with it. Also, apparently, the ten try limit and the increasing delay between tries are just firmware features.
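As a sanity check on the timing claim above, here is a trivial sketch; the only input is the ~80 ms per attempt figure quoted from the linked article:

```python
# Worst-case time to exhaust a numeric passcode space when every attempt
# must pass through the device's AES hardware at ~80 ms per try.
PER_TRY_S = 0.080

for digits in (4, 6):
    space = 10 ** digits
    total_s = space * PER_TRY_S
    print(f"{digits}-digit code: {space:,} candidates, "
          f"worst case {total_s:,.0f} s (~{total_s / 3600:.1f} h)")
```

This matches the 800 s figure in the post, and shows why even a 6-digit code falls in under a day at that rate.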

        1. Anonymous Coward
          Anonymous Coward

          As I understand it, the 10 try limit is enforced by the secure enclave on the 5S and newer. So this trick would only work on a 5c. But I doubt the Feds would appreciate the distinction and still ask Apple to find a way to break into a newer phone next time.

    2. gumbril

      I don't think he is, or at least if he is, it's because Apple is not giving the obvious defence. That being "We can't"; instead they are saying "We shouldn't". Maybe after "We shouldn't" is ruled on, they'll come out with "We can't".

      Now it may be that Apple wants to blow the "We shouldn't" debate open, which is happening - good - but I'd have thought that Apple would have to bring all its objections to the table in one go; courts tend to get miffed if things which you knew at the time were relevant are left out and only released piecemeal.

    3. This post has been deleted by its author

    4. Joe Gurman

      Amen

      I believe Mr. Potts is attempting to make a distinction without a difference.

    5. Adam 52 Silver badge

      There is an important legal question. The FBI are asking Apple to do work. Rather than hiring them (or a specialist security firm) and paying commercial rates they're trying to use a court order. There does need to be legal clarity on how much a judge can order - can he order Apple to devote an engineer for 1 week, 10 engineers for 10 years, 100 engineers for 100 years? Do they have to be the best available?

    6. Bob Dole (tm)

      Consider that data centers spend an absurd amount of money on physical security for database servers. The reason is that once a device is physically compromised, it is only a matter of time before its contents are decrypted. I don't know why people think handheld devices are any different. No one can protect a device once it is physically in the hands of someone else. If they could, data centers could dispense with a huge expense.

      The PIN code issue is a red herring. The FBI can copy the memory of the phone without turning it on. Once the memory is copied, a simple script can be leveraged to find the right PIN code. There are only 10,000 possibilities, so this isn't even time-consuming. So that's not what this is about.

      They aren't directly asking Apple to put a back door in. The approach is to force a company to develop the tools to compromise its own kit. Why? So the FBI doesn't have to do it for Microsoft, Cisco, Samsung or anyone else's devices. It costs money to keep programmers employed.

      So if the FBI can use the court system to essentially enact a law forcing companies to provide a way to decrypt their own kit, the natural thing businesses will do is build one into future products. This is how businesses work: if forced to comply with some restriction, they'll take the easiest path to compliance. Also, because Congress has so far backed away from enacting laws forcing compliance, it's the only way the FBI is going to get what it wants.

      This whole thing is deplorable and should be fought tooth and nail at every step. Kudos to Apple for doing so. I think I'll buy another iPad to support them.

      1. Anonymous Coward
        Anonymous Coward

        Copying the memory will not work

        The phone's flash isn't encrypted by the PIN itself; it is encrypted by a 128-bit (or maybe 256-bit) AES key which is unlocked by the PIN - though since this is a 5c, which doesn't have the secure enclave, I'm not sure how that part works.

        Either way if it was as simple as copying the flash and trying only 10,000 ways to decrypt it the FBI would have done that without even asking Apple.

    7. micheal

      disabling the 10-strikes-and-you're-out mechanism on the PIN

      But if Apple can do it, so can the hackers....

      I'd rather Apple tried / succeeded / fixed it, they even save on the "bug bounty" cost....what's not to like

      1. Roland6 Silver badge

        Re: disabling the 10-strikes-and-you're-out mechanism on the PIN

        But if Apple can do it, so can the hackers...

        If the hackers can do it, Apple have lost control of their private keys used to sign iOS updates...

    8. Sproggit

      Specious At Best, Wrong At Worst

      When Trevor writes, "Apple is wrong is in saying that the FBI is asking for a backdoor. It isn't. ", he is misrepresenting the facts as I've seen them reported.

      My understanding is: the Apple iPhone in question has been locked using the integral PIN locking mechanism. This has 4 digits and therefore 10,000 combinations [0000-9999]. It also has a mechanism such that if someone enters the wrong value 10 times, the phone will wipe its data. What the FBI are asking for, however, is a mechanism to obviate the "10 strikes" rule built into the iPhone.

      So let's go back.

      Trevor wrote, "Apple is wrong is in saying that the FBI is asking for a backdoor. It isn't. ". Well, the FBI are asking Apple to alter the software on the phone to explicitly allow a brute-force attack. If we are to apply debating-society levels of pedantry [I exaggerate only slightly] then that's not far off the truth. But what the FBI are asking for, then, is for Apple to change the iPhone software such that *anyone* could keep working through the 10,000 combinations until they got lucky. Let's say that the FBI find a dexterous employee able to check PINs at the rate of, say, 1 every 5 seconds. That's 12 per minute, or 720 per hour.

      In other words, the FBI is asking Apple to create a mechanism that would allow *anyone* [with reasonable dexterity and no concerns about RSI] to crack open an iPhone in approximately 14 hours.
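The 14-hour estimate is easy to verify:

```python
# One manual PIN attempt every 5 seconds, as assumed above.
attempts_per_hour = 3600 // 5       # 720 attempts per hour
worst_case_hours = 10_000 / attempts_per_hour
print(attempts_per_hour, round(worst_case_hours, 1))  # → 720 13.9
```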

      Now let's compare that 14-hour crack with what we'd expect a typical supercomputer to do with current levels of US Government encryption standards. [ i.e. check and see what the NIST recommendations are].

      Sorry, Trevor, but in all reasonable interpretations, the FBI are asking Apple to *massively degrade* the security of iPhones. Now if you take a look at documents like this one:

      https://www.apple.com/business/docs/iOS_Security_Guide.pdf

      from Apple themselves, you quickly see that the company has invested a great deal of time, money and effort into designing and delivering what they believe to be secure products. So just imagine the law suits that would emerge if Apple were to do an about-face and degrade iPhone security in this way. It would happen quicker than you can say "Class Action".

      This aspect of your article is misleading, specious and inappropriate. Correction and retraction, please...

  3. Missing Semicolon Silver badge
    FAIL

    Insecure by design

    .. because Apple want to be able to change the software in your device without your permission, to remove features or to add locks that will potentially increase revenue/reduce jailbreaking.

    1. This post has been deleted by its author

      1. Yet Another Anonymous coward Silver badge

        Re: Insecure by design

        >I wonder if part of the reason Apple is taking a hard line is because Cook is gay.

        I think there is less of a knee-jerk reaction to always support the government among the average Apple/Google employee today than there might have been at IBM 20 years ago.

    2. John Robson Silver badge

      Re: Insecure by design

      They don't need to sneak it in like this though - they can be quite open about it, and have it as part of a normal upgrade. They could, I'm sure, make a mechanism for enforced upgrades to certain modules, but it should still wait for the phone to be unlocked (maybe allow 5 "wait" prompts?)

      I like the ideas from a previous thread:

      - Firmware/Software should only be updatable on an unlocked phone

      - Charging should only be possible when authenticated*, or powered down

      - Secure boot time passphrase permitted (separate from the unlock screen)

      *Authenticated - maybe allow for configuring certain networks/geofencing for convenient charging at home, but frankly unlocking the phone when you plug it in is hardly a major chore.

      This then means that the phone can't be indefinitely kept "asleep" by a nefarious individual - and brings the boot passphrase into play.

    3. Anonymous Coward
      Anonymous Coward

      Re: Insecure by design

      Can you still get root on the majority of iPhones out there simply by sending a malicious SMS, as with the other market-leading mobile OS? If anything, this case shows that its security up to now is pretty solid.

      >because Apple want to be able to change the software in your device

      As opposed to certain other companies that are perfectly content to leave the software on your device as they sold it to you and if you want security updates you need to buy the latest model. And Apple can't install updates to your device without your permission unlike Microsoft these days.

      1. Anonymous Coward
        Anonymous Coward

        Re: Insecure by design- malicious SMS

        The short answer is no, if you mean the Mazar Bot; it's a Trojan.

        The message you get is

        "You have received a multimedia message from +[country code] [sender number] Follow the link http: //www.mmsforyou [.] net / mms.apk to view the message."

        And you do have to have enabled installation from unknown sources first.

        Basically you need to have taken the lock off the door and then invited in the complete stranger. It is not quite "simply by sending a malicious SMS". I don't know the percentage of phones that allow installation from untrusted sources, but outside Russia and China it has been claimed to be below 1%.

        1. Anonymous Coward
          Anonymous Coward

          Re: Insecure by design- malicious SMS

          Sorry, I meant MMS, where some of the flaws require no user intervention for root:

          http://www.theregister.co.uk/2015/07/27/android_phone_text_flaw/

          and

          http://www.theregister.co.uk/2016/01/04/android_january_fixes/

          And yes, skippy, you might be patched and protected, but you, Mr First World IT geek, with your whizz-bang new phone, are hardly representative of all the Android phones out in the wild.

  4. phil dude
    Thumb Up

    a few thoughts....

    1) A nice article Trevor, seems to have the right tone.

    2) Until I read the excellent slashdot comment (regarding the feasibility of Apple caving), it was not clear *how* thorough Apple had been. I urge all other commentards to read this thread, as it is informative on a complex topic.

    The optimist in me wants to see what Google proposes for Android....

    Maybe the tech industry will recognise their future depends on their devices being secure, against all foes....

    P.

    1. Anonymous Coward
      Anonymous Coward

      Re: a few thoughts....

      One proviso to that otherwise excellent article. The phone the terrorist had was apparently a 5c, which is an older one that did not have the secure enclave. Only the 5S and newer have this.

      I'm not really sure if everything he describes there is the same since they made some changes to strengthen the device's security when they added the secure enclave.

  5. Anonymous Coward
    Anonymous Coward

    I must say, the US authorities seemed quite pleased with the Apple failsafe in the past.

  6. ratfox

    This is wishful thinking

    or it is possible to read the data off the flash chips and attack it in a VM until the password is brute-forced.

    I don't see how it could be possible to stop this from happening. If a system exists that allows one to enter a 4-digit code and decrypt the device, then surely it is possible to reconstruct that system so that it does not erase the memory after N attempts.

    The only way I can see to prevent this from happening is to protect the device not with a 4-digit PIN, but with a decryption key that is so long that trying all solutions would take centuries. But I frankly doubt anybody would want to type something that long every time they want to check their email.
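A rough sketch of that trade-off, with an assumed (purely illustrative) offline rate of 10^9 guesses per second:

```python
GUESSES_PER_SEC = 1e9               # assumed attacker rate, for illustration
SECONDS_PER_YEAR = 3600 * 24 * 365

def worst_case_years(keyspace: float) -> float:
    # Time to exhaust the whole keyspace at the assumed rate.
    return keyspace / GUESSES_PER_SEC / SECONDS_PER_YEAR

pin_space = 10 ** 4                 # 4-digit PIN
phrase_space = 7776 ** 6            # six random Diceware-style words

print(f"4-digit PIN:       {worst_case_years(pin_space):.1e} years")
print(f"6-word passphrase: {worst_case_years(phrase_space):.1e} years")
```

Even at that rate a six-word passphrase takes millions of years worst case, while the PIN falls instantly; the usability objection above is exactly why on-device rate limiting, or a key entangled with device hardware, carries the load instead.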
