Why Tim Cook is wrong: A privacy advocate's view

Apple has recently released an open letter explaining why it will challenge a judicial order requiring it to hack the iPhone of one of the accused San Bernardino terrorists. As someone who believes in individual civil liberties and personal privacy above nearly all other considerations, my first instinct is to applaud Apple. …

  1. Anonymous Coward
    Anonymous Coward

    1. All back-door schemes reduce to: Send a plaintext copy of all communications to the government.

    2. All crypto schemes should assume the hash will be stolen for offline cracking by a state-level actor. Key-stretch accordingly.

    3. Don't buy American is already a good principle.
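    The key-stretching advice in point 2 can be sketched with Python's standard-library PBKDF2 (the passphrase and iteration count here are illustrative, not recommendations):

```python
import hashlib
import os

# Key stretching: derive the stored verifier with a deliberately slow KDF,
# so a state-level actor who steals the hash pays a high cost per offline
# guess. The iteration count is illustrative only.
def stretch(passphrase: bytes, salt: bytes, iterations: int = 600_000) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", passphrase, salt, iterations)

salt = os.urandom(16)
key = stretch(b"correct horse battery staple", salt)
print(len(key))  # 32 bytes = a 256-bit derived key
```

    Raising the iteration count scales the attacker's per-guess cost linearly while costing the legitimate user a single derivation at login.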

    1. MyffyW Silver badge

      The article seems to suggest that given the common good is served by knowing about this flaw and we can therefore agitate for a patch to it, the boys in blue should be allowed to use it, just this once, because the owner is suspected of being particularly unpleasant.

      I still feel my size 7s losing traction on that slippery slope.

    2. asdf Silver badge

      except

      >Don't buy American is already a good principle.

      Except the iPhone's security is front and center as being pretty solid, what with the Feds throwing a fit. Do you really think buying a Xiaomi Android because it's not American will make you safer? (Hint: Android full-phone encryption tends to be garbage, which is why Google backed off requiring it.) Yes, the US government sucks, but they are incompetent enough to let the world know what they are up to, unlike many other governments.

      1. Anonymous Coward
        Anonymous Coward

        Re: except

        How often do you travel to China, how often do you travel to the USA?

        How often do you compete with US owned companies vs Chinese companies?

        I don't care if the People's Patriotic Army have a black box they can connect my Xiaomi phone to at a security checkpoint, because I don't go to China and China doesn't make jet engines.

        Having a black box that the TSA can routinely run all arriving foreigners' phones through - that is an issue to me, and to my employer.

        1. Anonymous Coward
          Anonymous Coward

          Re: except

          And more than likely the three-letter government agencies will be able to get at what's on your Xiaomi phone much more easily than an iPhone, regardless of what country you visit. You notice the FBI hardly singles out Google at all when ranting (if nothing else, due to the very large percentage of Android phones that are never encrypted).

        2. midcapwarrior

          Re: except

          Except they actually do make jet engines

          1. asdf Silver badge

            Re: except

            with parts probably from 25 other countries. But keep believing that not buying American, or only buying from UK companies, makes a difference in this era of multinationals picking where they pay taxes and suing governments for harming their shareholders.

            1. DrBobMatthews

              Re: except

              Oh please! So you think that Boeing doesn't use parts sourced from companies other than US companies! I am not a medical doctor, but I have the feeling that your severe case of myopia, ignorance and stupidity rules you out of making a valid comment.


        3. cosmogoblin
          Black Helicopters

          Re: except

          Is it paranoid to wonder, when it takes my phone/laptop an extra 30 seconds to get through security compared with everybody else's, if they have installed keylogging hardware?

        4. Graham Triggs

          Re: except

          Do you regularly send emails that would give the TSA a reason to consider you a risk to board a plane?

          I have a really hard time with arguments about how bad it is for the FBI / TSA / etc. to have access to your personal details because of what they might do - no law-abiding citizen should have anything to fear from that.

          However, that doesn't mean that the FBI / TSA / etc. should have access to this information, because

          a) Any backdoor that allows the authorities in is immediately a vulnerability that could be exploited by *anyone*

          b) Can you be certain that the authorities systems won't be breached?

          It is *impossible* to protect the human rights of law abiding citizens (you know, the things like keeping them safe from being murdered), if you don't obtain information that "the bad guys" thought they were keeping private (whether that's because of their own deficiencies, or a deliberate breach).

          But proportionality and accountability are important. What the FBI wants is not a consequence-free solution; it creates its own problems, and we have to be honest about that.

          1. 6th

            Re: except

            Whatever. With online backups, just make sure you have factory-reset your devices before handing them over to the authorities; walk out of the airport etc. and reconnect to your profile.

            You won't get me to log in to my online account, even over hot coals. If it's insisted that you log in to your account, have an empty or dummy account ready - just for them.

            If the agencies are going to treat adults with such contempt as to make us out to be children, I see no reason why we cannot play the game.

            They started it...

            1. Paul Smith

              Re: except

              Online backups on US-owned, US-based or US-controlled servers are just saving the TSA the hassle of hacking your device.

          2. Captain DaFt

            Re: except

            "no law abiding citizen should have anything to fear from that."

            That's already taken care of. There are no law-abiding citizens in the US.

          3. Paul Smith

            Re: except

            I had a quick glance at the US constitution and I couldn't see anything about being protected from murder. Nor did I see anything that gives potential or unnamed victims supremacy over my rights. In fact, the Fourth Amendment is pretty explicit about what must be done before my rights can be violated, and stopping a murder (which this case is not about) is not on the list.

      2. KeithR

        Re: except

        "Yes the US government sucks but they are incompetent enough to let the world know what they are up to unlike many other governments"

        You don't seriously believe that, do you?

      3. Anonymous Coward
        Anonymous Coward

        Re: Xiaomi android because its not American will make you safer

        No. Android is developed by an American Company and without play services is supposedly pretty limited. (According to BlackBerry, hence flushing their own system, that runs non play service Android Apps rather well, down the priv.)

        BB10 would make you safer, but it looks like it will be going the same way as Symbian (also not American.)

    3. ma1010 Silver badge
      Unhappy

      So "Don't Buy American"?

      Yes, the US government, flogged on by assholes like Diane Feinstein (Senator representing the NSA and various other elitist bastards), certainly wants nobody (except her very important self and a few cronies, of course) to have any privacy - or any other rights of any kind, for that matter.

      It's a really good thing that, on the other side of the pond, there's nothing like a Snooper's Charter or any organization like GCHQ that snoops on everyone's traffic. Brits are so lucky to be free of that sort of thing! Not to mention other countries, such as France, where the government would never snoop on their own citizens. Well, not any more than China or Saudi Arabia does, anyway.

      I do strongly agree with the author of this article, but I can't imagine anything like that ever happening. If Apple (or anyone else) came out with really unbreakable encryption, they'd just outlaw it and throw anyone who used it into the Gulag. And that goes on either side of the pond, and in either hemisphere, too. Unfortunately.

      1. DrBobMatthews

        Re: So "Don't Buy American"?

        I trust none of the so-called security agencies, least of all the NSA, who seem to think that destabilising an elected foreign government by the use of paid thugs to create riots, all financed by the State Department, is legitimate and legal. This action was of course supported by the whimpering jellyfish of a UK government, who have to ask US permission to wipe their own backsides.

        No longer do the USA and UK governments represent freedom; they are both quick to condemn other states for their shortcomings, but con their own populations by selling them the lie of security.

        Without this big con, the US and UK arms makers and dealers would go bust.

        Why is the US State Department so worried about a single guy holed up in the Ecuadorian Embassy in London? Are they terrified he might just release even more truths about the US, UK, NATO and the dodgy nuclear deals with Israel? Or are they afraid that the truth might be revealed regarding which country's missile brought down a civilian airliner? History tells us that the US is very adept at downing civilian airliners and decorating those responsible. Of the major powers, the US and the UK are not to be trusted; neither, for that matter, are Russia, Israel or Saudi Arabia. They are all governed by megalomaniacs.

  2. Alister Silver badge

    Either way, it would appear to your correspondent that Apple screwed up when designing this device and it left open a means of attack. The judge is asking Apple to use its expertise to exploit this flaw. It's as simple as that.

    As far as I am concerned, the judge is in the right here. Apple is not being ordered to create a flaw and distribute it to all devices. It is not being prevented from fixing this flaw in future devices. It is being asked to exploit a flaw that currently exists, and for the privacy-conscious this is actually a good thing.

    I'm sorry Trevor, but I think you may be extrapolating too far here.

    The judge has asked Apple to assist as you say, by disabling the 10-strikes-and-you're-out mechanism on the PIN, but I don't see that this means there is necessarily a flaw which allows that, it may just be wishful thinking on the part of the judge, much as the requests to provide "encryption which is breakable only by governments" which have sprung from clueless officials on both sides of the Atlantic.

    1. Dan 55 Silver badge

      Search "How Apple's encryption works" in this text...

      http://blog.cryptographyengineering.com/2014/10/why-cant-apple-decrypt-your-iphone.html

      The secure enclave enforces a five-second delay between failed PIN attempts, it can't return the encryption keys, and they'd have to flash new firmware onto the running device in the first place, without the device's co-operation.

      1. Anonymous Coward
        Anonymous Coward

        How does the secure enclave time the five seconds? What if you make those five "seconds" last five ms or even less?

        1. Dan 55 Silver badge

          By overclocking the device by several orders of magnitude? It'd crash the CPU I expect.

          1. Danny 14 Silver badge

            aye but

            All code can be altered or bypassed, and Apple might know how. Or they might know how to clone an identical phone, or clone it to a VM; 10,000 combinations isn't a lot, so 1,000 VMs would be needed at 10 tries each.

            The FBI has gone to a court to get Apple to help it with one device. There is nothing about the FBI wanting a permanent method for all iPhones.

            1. Doctor Syntax Silver badge

              Re: aye but

              "The fbi has gone to a court to get apple help it with one device. There is nothing of the fbi wanting a permanent method for all uphones."

              There's a good deal of wanting to use an unusual set of circumstances to set a precedent for lots of other circumstances.

            2. Danny 2 Silver badge

              Re: aye but

              The correct phrase is 'Aye, right'. Nobody says "Aye, but". Not once, not ever. Are you trying to discredit me and the presumably other 12 or so Dannys here? What a daft post. Utter mince.

            3. David Walker

              Re: aye but

              It's all just speculation, so... If it was a simple flaw to exploit, the FBI would do it and not need Apple. Tim's email seems to imply that there would need to be hardware produced in combination with software, and that it could be used on other devices. Probably a firmware or hardware certificate only known to Apple is needed, and this combination might apply to a range of production units. Even a demonstration in principle could create a huge legal battle for Apple. I'm sure the army of "they should do it for god and country and to get those nasty terrorists" includes a few ambulance chasers who would immediately class-action sue the company for falsely claiming they can't access data when the FBI will very publicly announce that they did. You see, in America suing is a national pastime. Hell, you can sue someone for saving your life using CPR if the compression technique that was used leaves a boo-boo. A lobby group funded by trial lawyers helped defeat laws to reduce patent troll cases - how dare someone ruin that fun. A woman sued 'cause her hot coffee was too hot - and won. So this case is just one of a string of cases that will provide entertainment and lobby groups $$ for years to come. Oh, and if it does get pushed up to the Supreme Court - good luck, politics will keep them out of commission for at least 1.5 years. Well, time to make some popcorn and turn on C-SPAN; it's going to be a long, long night.

      2. calmeilles

        It seems that the phone in question is a 5c and doesn't have the secure enclave.

        https://blog.trailofbits.com/2016/02/17/apple-can-comply-with-the-fbi-court-order/

      3. Pseu Donyme

        > http://blog.cryptographyengineering.com/2014/10/why-cant-apple-decrypt-your-iphone.html

        Thanks for the link! :)

        The article explains why Apple can in fact decrypt an iPhone given physical access. A simplified version seems to be that the AES keys are ultimately derived from the passcode and a 256-bit unique per-chip key (UID) baked into the core SoC (A6 in the case of a 5c) at the time of manufacture. The UID is not accessible to software as such, but can be fed to the AES hardware via an internal hardware path. This means that firmware could brute-force the derived key(s) by going through the passcodes using the AES hardware and seeing what decrypts. For simple/short codes at least this is quite feasible: using the 80 ms per iteration from the above article, a 4-digit code space would be completely covered in 800 s.

        The 5c / A6 does not have the 'secure enclave' so this is not a consideration for the case at hand, but since the code running there is also a part of the firmware provided by Apple this wouldn't seem to make a difference for the newer models (from 5s / A7 onward) with it. Also, apparently, the ten try limit and the increasing delay between tries are just firmware features.
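        Taking the 80 ms per try at face value, the arithmetic checks out (a sketch; the per-try cost is the linked article's figure, not a measurement):

```python
# Worst-case brute-force time for a numeric passcode space, assuming the
# 80 ms per derivation quoted in the linked article.
PER_TRY_S = 0.080

def worst_case_seconds(digits: int) -> float:
    return (10 ** digits) * PER_TRY_S

print(worst_case_seconds(4))  # 800.0 -> a 4-digit space falls in ~13 minutes
print(worst_case_seconds(6))  # 80000.0 -> ~22 hours for 6 digits
```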

        1. DougS Silver badge

          As I understand it, the 10 try limit is enforced by the secure enclave on the 5S and newer. So this trick would only work on a 5c. But I doubt the Feds would appreciate the distinction and still ask Apple to find a way to break into a newer phone next time.

    2. gumbril

      I don't think he is - or at least, if he is, it's because Apple is not giving the obvious defence. That being "We can't"; instead they are saying "We shouldn't". Maybe after "We shouldn't" is ruled on they'll come out with "We can't".

      Now it may be Apple wants to blow the "We shouldn't" debate open, which is happening - good - but I'd have thought that Apple would have to bring all their objections to the table in one go; courts tend to get miffed if things are left out which you knew at the time were relevant, and are only released piecemeal.

    3. This post has been deleted by its author

    4. Joe Gurman

      Amen

      I believe Mr. Potts is attempting to make a distinction without a difference.

    5. Adam 52 Silver badge

      There is an important legal question. The FBI are asking Apple to do work. Rather than hiring them (or a specialist security firm) and paying commercial rates they're trying to use a court order. There does need to be legal clarity on how much a judge can order - can he order Apple to devote an engineer for 1 week, 10 engineers for 10 years, 100 engineers for 100 years? Do they have to be the best available?

    6. Bob Dole (tm)

      Consider that data centers spend an absurd amount of money on physical security for database servers. The reason is that once a device is physically compromised, it is only a matter of time before the contents of that device are decrypted. I don't know why people think handheld devices are any different. No one can protect a device once it is physically in the hands of someone else. If they could, data centers could dispense with a huge expense.

      The PIN code issue is a red herring. The FBI can copy the memory of the phone without turning it on. Once the memory is copied, a simple scripted attack can be leveraged to find the right PIN code. There are only 10,000 possibilities, so this isn't even time consuming. So that's not what this is about.

      They aren't directly asking Apple to put a back door in. The approach is to force a company to develop the tools to compromise their own kit. Why? So the FBI doesn't have to do it for Microsoft, Cisco, Samsung or anyone else's devices. It costs money to keep programmers employed.

      So if the FBI can use the court system to essentially enact a law forcing companies to provide a way to decrypt their own kit, the natural thing businesses will do in future is program one in. This is how businesses work: if forced to comply with some restriction, they'll take the easiest path to compliance. Also, because Congress has so far backed away from enacting laws forcing compliance, it's the only way the FBI is going to get what it wants.

      This whole thing is deplorable and should be fought tooth and nail at every step. Kudos to Apple for doing so. I think I'll buy another iPad to support them.

      1. DougS Silver badge

        Copying the memory will not work

        The phone's flash isn't encrypted with the PIN; it is encrypted with a 128-bit (or maybe 256-bit) AES key which is unlocked by the PIN - though since this is a 5c, which doesn't have the secure enclave, I'm not sure how that part works.

        Either way if it was as simple as copying the flash and trying only 10,000 ways to decrypt it the FBI would have done that without even asking Apple.

    7. micheal

      disabling the 10-strikes-and-you're-out mechanism on the PIN

      But if Apple can do it, so can the hackers....

      I'd rather Apple tried / succeeded / fixed it, they even save on the "bug bounty" cost....what's not to like

      1. Roland6 Silver badge

        Re: disabling the 10-strikes-and-you're-out mechanism on the PIN

        But if Apple can do it, so can the hackers...

        If the hackers can do it, Apple have lost control of their private keys used to sign iOS updates...

    8. Sproggit

      Specious At Best, Wrong At Worst

      When Trevor writes, "Apple is wrong is in saying that the FBI is asking for a backdoor. It isn't. ", he is misrepresenting the facts as I've seen them reported.

      My understanding is: the Apple iPhone in question has been locked using the integral PIN locking mechanism. This has 4 digits and therefore 10,000 combinations [0000-9999]. It also has a mechanism such that if someone enters the wrong value 10 times, the phone will wipe its data. What the FBI are asking for, however, is a mechanism to obviate the "10 strikes" rule built into the iPhone.

      So let's go back.

      Trevor wrote, "Apple is wrong is in saying that the FBI is asking for a backdoor. It isn't. ". Well, the FBI are asking Apple to alter the software on the phone to explicitly allow a brute-force attack. If we are to apply debating-society levels of pedantry [I exaggerate only slightly] then that's not far off the truth. But what the FBI are asking for, then, is for Apple to change the iPhone software such that *anyone* could keep working through the 10,000 combinations until they got lucky. Let's say that the FBI find a dexterous employee able to check PINs at the rate of, say, 1 every 5 seconds. That's 12 per minute, or 720 per hour.

      In other words, the FBI is asking Apple to create a mechanism that would allow *anyone* [with reasonable dexterity and no concerns about RSI] to crack open an iPhone in approximately 14 hours.
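      That estimate is easy to verify (a quick sketch using the rates from the comment above):

```python
# Manual brute force at one PIN attempt every 5 seconds.
tries = 10_000              # 4-digit space, 0000-9999
per_hour = (60 // 5) * 60   # 12 per minute -> 720 per hour
hours = tries / per_hour
print(per_hour, round(hours, 1))  # 720 tries/hour, ~13.9 hours worst case
```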

      Now let's compare that 14-hour crack with what we'd expect a typical supercomputer to do with current levels of US Government encryption standards. [ i.e. check and see what the NIST recommendations are].

      Sorry, Trevor, but in all reasonable interpretations, the FBI are asking Apple to *massively degrade* the security of iPhones. Now if you take a look at documents like this one:-

      https://www.apple.com/business/docs/iOS_Security_Guide.pdf

      from Apple themselves, you quickly see that the company has invested a great deal of time, money and effort into designing and delivering what they believe to be secure products. So just imagine the law suits that would emerge if Apple were to do an about-face and degrade iPhone security in this way. It would happen quicker than you can say "Class Action".

      This aspect of your article is misleading, specious and inappropriate. Correction and retraction, please...

  3. Missing Semicolon
    FAIL

    Insecure by design

    .. because Apple want to be able to change the software in your device without your permission, to remove features or to add locks that will potentially increase revenue/reduce jailbreaking.

    1. This post has been deleted by its author

      1. Yet Another Anonymous coward Silver badge

        Re: Insecure by design

        >I wonder if part of the reason Apple is taking a hard line is because Cook is gay.

          I think there is less of a knee-jerk reaction to always support the government among the average Apple/Google employee today than there might have been at IBM 20 years ago.

    2. John Robson Silver badge

      Re: Insecure by design

      They don't need to sneak it in like this though - they can be quite open about it, and have it as part of a normal upgrade. They could, I'm sure, make a mechanism for enforced upgrades to certain modules, but it should still wait for the phone to be unlocked (maybe allow 5 "wait" prompts?)

      I like the ideas from a previous thread:

      - Firmware/Software should only be updatable on an unlocked phone

      - Charging should only be possible when authenticated*, or powered down

      - Secure boot time passphrase permitted (separate from the unlock screen)

      *Authenticated - maybe allow for configuring certain networks/geofencing for convenient charging at home, but frankly unlocking the phone when you plug it in is hardly a major chore.

      This then means that the phone can't be indefinitely kept "asleep" by a nefarious individual - and brings the boot passphrase into play.

    3. Anonymous Coward
      Anonymous Coward

      Re: Insecure by design

      Can you still get root on the majority of iPhones out there simply by sending a malicious SMS, like on the other market-leading mobile OS? If anything, this case shows its security up to now is pretty solid.

      >because Apple want to be able to change the software in your device

      As opposed to certain other companies that are perfectly content to leave the software on your device as they sold it to you - and if you want security updates, you need to buy the latest model. And Apple can't install updates on your device without your permission, unlike Microsoft these days.

      1. Anonymous Coward
        Anonymous Coward

        Re: Insecure by design- malicious SMS

        The short answer is no, if you mean the Mazar Bot; it's a Trojan.

        The message you get is

        "You have received a multimedia message from +[country code] [sender number] Follow the link http: //www.mmsforyou [.] net / mms.apk to view the message."

        And you do have to have enabled installation from unknown sources first.

        Basically you need to have taken the lock off the door and then invited in the complete stranger. It is not quite "simply by sending a malicious SMS". I don't know the percentage of phones that allow installation from untrusted sources, but outside Russia and China it has been claimed to be below 1%.

        1. Anonymous Coward
          Anonymous Coward

          Re: Insecure by design- malicious SMS

          Sorry, I meant MMS - some of those flaws require no user intervention to get root:

          http://www.theregister.co.uk/2015/07/27/android_phone_text_flaw/

          and

          http://www.theregister.co.uk/2016/01/04/android_january_fixes/

          And yes, skippy, you might be patched and protected, but you, Mr 1st World IT geek with your new whiz-bang phone, are hardly representative of all the Android phones out in the wild.

  4. phil dude
    Thumb Up

    a few thoughts....

    1) A nice article Trevor, seems to have the right tone.

    2) Until I read the excellent slashdot comment (regarding the feasibility of Apple caving), it was not clear *how* thorough Apple had been. I urge all other commentards to read this thread, as it is informative on a complex topic.

    The optimist in me wants to see what Google proposes for Android....

    Maybe the tech industry will recognise their future depends on their devices being secure, against all foes....

    P.

    1. DougS Silver badge

      Re: a few thoughts....

      One proviso to that otherwise excellent article. The phone the terrorist had was apparently a 5c, which is an older one that did not have the secure enclave. Only the 5S and newer have this.

      I'm not really sure if everything he describes there is the same since they made some changes to strengthen the device's security when they added the secure enclave.

  5. Anonymous Coward
    Anonymous Coward

    I must say, the US authorities seemed quite pleased with the Apple failsafe in the past.

  6. ratfox Silver badge

    This is wishful thinking

    or it is possible to read the data off the flash chips and attack it in a VM until the password is brute-forced.

    I don't see how it could be possible to stop this from happening. If a system exists that allows one to enter a 4-digit code and decrypt the device, then surely it is possible to reconstruct that system so that it does not erase the memory after N attempts.

    The only way I can see to prevent this from happening is to protect the device not with a 4-digit PIN, but with a decryption key that is so long that trying all solutions would take centuries. But I frankly doubt anybody would want to type something that long every time they want to check their email.
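    Roughly how long is "so long"? A sketch, assuming a hypothetical offline attacker making one billion guesses per second (the rate is an assumption for illustration):

```python
# Exhaustive-search time for different secret sizes at an assumed guess rate.
GUESSES_PER_SECOND = 1e9
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def years_to_exhaust(alphabet_size: int, length: int) -> float:
    return (alphabet_size ** length) / GUESSES_PER_SECOND / SECONDS_PER_YEAR

print(years_to_exhaust(10, 4))   # 4-digit PIN: effectively instant
print(years_to_exhaust(62, 12))  # 12 random alphanumerics: ~100,000 years
```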

    1. Anonymous Coward
      Anonymous Coward

      Re: This is wishful thinking

      You don't need to. Somewhere this data is stored.

      Now, how many times have you had a debit/credit card that expires, and then when the new one comes the PIN is the same as the old one? I find that spooky.

      So I guess (never guess!) this is what happens with smartarse phones.

      1. Steve K Silver badge

        Re: This is wishful thinking

        > Now, how any times did you have a debit/credit that expires and then when the new one comes the PIN is the same as the old one? I find that spooky.

        Might be missing the point here, but if it's got the same card number then isn't that expected (since the stored hash will be cryptographically based on the long card number and the PIN plus maybe some other salting)?

        The naked PIN itself will be unknown to the issuer since otherwise that breaks everything.

        Steve

        1. Captain Queeg

          Re: This is wishful thinking

          Many cards change the long number on re issue - Barclays Visa Debit cards spring to mind.

          At least the last four digits always change. So a fixed 16 digit card number isn't guaranteed or necessary.

          1. Danny 14 Silver badge

            Re: This is wishful thinking

            Or duplicate the chip in software and clone the data into a VM. After all, Apple will know how the chips generate their keys, and it might not be a method shrouded in voodoo. Apple might even have a process for it, or logs (or the seed process etc.).

            Simply duplicate to a VM and restore the VM after 10 tries.

      2. Rob Crawford

        Re: This is wishful thinking

        Perhaps because the hash for the PIN is server side and the new card record is simply pointed at the old hash?

    2. Paul Crawford Silver badge

      Re: This is wishful thinking

      If you read the slashdot article you see why - the limit on brute force is largely in the crypto chip. The key used for the data is massive - 256-bit symmetric AES - and is only revealed by the crypto chip on success, so it's not a 4-6 digit PIN's worth of tries. So the options are:

      1) Brute-forcing a 256-bit key - possible with NSA resources, I guess, but a serious challenge.

      2) Somehow compromising the crypto chip. How hard that is depends on its design: maybe it can be done thanks to sloppy mistakes, or maybe it really is properly tamper-proof, in which case Apple's position is 100% correct - it simply can't do it.

      1. Anonymous Coward
        Anonymous Coward

        Re: This is wishful thinking

        What if you feed PINs directly to the chip until it reveals the key? If you have the physical device, you can disassemble it and attack the hardware directly; you don't need to go through all the software... Does the chip enforce a delay? Can you shorten that delay somehow by feeding the chip different hardware inputs?

      2. tom dial Silver badge

        Re: This is wishful thinking

        If my arithmetic is approximately correct, brute-forcing even a 128-bit key could be expected to take 7 or 8 times the current age of the universe - if you applied 10 billion machines, each capable of performing 10 billion encryption operations a second. A 256-bit key would take 2^128 times longer still.

        It is not within the capability of NSA or anyone else.
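        A quick check of those rates (the age-of-universe constant is approximate):

```python
# Sanity-check the brute-force arithmetic: 10 billion machines, each doing
# 10 billion operations per second. Age of universe taken as ~13.8 Gyr.
AGE_OF_UNIVERSE_S = 4.35e17
RATE = 1e10 * 1e10  # total guesses per second

def universe_ages_to_exhaust(bits: int) -> float:
    return (2 ** bits) / RATE / AGE_OF_UNIVERSE_S

print(universe_ages_to_exhaust(128))  # ~7.8 universe ages
print(universe_ages_to_exhaust(256))  # ~2.7e39 universe ages
```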

        1. Jaybus

          Re: This is wishful thinking

          Ah, but this is why they want the hacked firmware. A7 (and newer?) processors have a cryptographic hardware co-processor known as the "Secure Enclave". This device contains a unique AES-256 key that is burned into the device when manufactured and unknown to either Apple or the manufacturer. The 6-digit passcode is entangled with this UID key by passing both through the PBKDF2-AES algorithm, a key-derivation function purposefully designed to consume lots of CPU cycles.

          The Secure Enclave firmware is itself encrypted, and only Apple holds the signing keys. Even Apple cannot directly discover the Secure Enclave's UID, but only the result obtained by passing known tokens through it. While this is by no means a backdoor, it is possible to create a UID guessing firmware.

          The issues, for the three-letter agencies or any other attacker, are that the firmware will/may wipe the device after too many failed attempts, and that attempts must be entered manually. A hacked firmware could circumvent both issues and allow a vastly more expedited brute-force attack. While the three-letter agencies certainly have people capable of hacking the firmware themselves, only Apple has the firmware signing keys. This allows them to brute-force the 6-digit passcode, NOT the AES-256 key.
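          The entanglement step can be sketched like this (illustrative only; the KDF parameters and the use of the UID as salt are assumptions, not Apple's actual construction):

```python
import hashlib

# Entangle a short passcode with a device-unique secret via PBKDF2, per the
# description above. Parameters are assumptions, not Apple's real scheme.
def derive_key(passcode: str, uid_key: bytes, iterations: int = 50_000) -> bytes:
    # Using the per-chip UID as the salt ties the derived key to one device:
    # guesses cannot be checked without access to that chip's secret.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), uid_key, iterations)

uid = bytes(32)  # stand-in for the 256-bit per-chip UID
k1 = derive_key("123456", uid)
k2 = derive_key("654321", uid)
print(k1 != k2, len(k1))  # True 32
```

          This is why the brute force has to run on the device (or with firmware the device will accept): the UID never leaves the chip.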

          The interesting bit is that the justice did not side against privacy with the decision. The iPhone in question is not the suspect's own phone; it was his work phone and the property of San Bernardino County. Essentially, the government wants to break into one of its own phones. From the point of view of the justice, there is no privacy issue in this case, and the justice is purposefully limiting the scope to just this case. Apple is being a bit more far-sighted in refusing to give the government a tool to use in future, not to mention the possibility of an Edward Snowden wannabe selling it off into the wild.

          So, no, Tim Cook is NOT wrong.

  7. Gordon 10 Silver badge
    FAIL

    Trevor's missing the point

    It's a big assumption that the judge/plod knows of a flaw - they could easily be on a fishing expedition.

    Not sure what the waffle about having a secure device is about. Both Apple and Google have regular patching programs - arguably Apple's is more effective (given the lack of updates on some Androids).

    They seem as keen as any tech firm to keep the updates coming - not sure how that makes them any more or less "secure" than anyone else.

    And frankly it's irrelevant to what the judge is asking for in this case.

    1. Danny 14 Silver badge

      Re: Trevor's missing the point

      They aren't looking for a flaw; they are asking Apple to disable a protection. Not exploiting a known existing flaw, but using Apple's expertise in CREATING a flaw on that phone.

      1. Anonymous Bullard

        Re: Trevor's missing the point

        The flaw is, it's possible to create a flaw.

        1. Sirius Lee

          Re: Trevor's missing the point

          Yes. And Pott has missed the point for years now.

  8. Jason Bloomberg Silver badge

    Our law enforcement cannot be trusted, and neither can our government.

    Quite possibly. Which means we're collectively fucked as no one will agree what should be done and no one can agree on who decides what will be done.

    1. Anonymous Coward
      Anonymous Coward

      Which means civilization is basically a crapshoot... the problem being that the only thing worse than civilization is its sole alternative: anarchy.

  9. Lysenko

    Not even wrong...

    This entire edifice is based on the demonstrably flawed premise that terrorism is a significant threat warranting special consideration. It isn't.

    Apple is allegedly working on a car. Road traffic accidents kill between 30k and 40k Americans every single year. Diverting any resources at all (which means incurring any costs at all) from self driving cars to iPhone cracking is self evidently a misapplication of effort.

    Apple is correct in refusing to set a precedent by getting dragged into politically motivated, principle abrogating crusades against threats that statistically barely exist.

    1. Anonymous Coward
      Anonymous Coward

      Re: Not even wrong...

      These people killed fourteen. The attack in France killed 130. Is this not a significant threat?

      I guess if there was a huge bomb in Cupertino using an iPhone as the fuse, Apple will be able to access it remotely in a few minutes....

    2. boltar Silver badge

      Re: Not even wrong...

      "This entire edifice is based on the demonstrably flawed premise that terrorism is a significant threat warranting special consideration. It isn't."

      Isn't it? Oh ok, you'd better tell that to the 2 million Syrians who've fled their country thanks to ISIS. Also tell it to the relatives of 9/11.

      "Road traffic accidents kill between 30k and 40k Americans every single year"

      Oh spare us the "X happens so why care about Y" false comparisons. People undertake road travel voluntarily, knowing what the risks are; terrorism is involuntary. Perhaps you'd be happy flying on a plane knowing no one had bothered to check the luggage because the 200 or so of you on there is less than the daily roadkill so who cares if you die?

      1. Anonymous Coward
        Anonymous Coward

        Re: Not even wrong...

        > Also tell it to the relatives of 9/11.

        Tell that to 6 million of my relatives whose deaths were justified for "national security".

      There is a historical opinion that the fact that the German word for safety and security is the same ("Sicherheit") made it easier to claim a "think of the children" justification for the Holocaust.

        1. This post has been deleted by its author

        2. asdf Silver badge

          Re: Not even wrong...

          > Also tell it to the relatives of 9/11.

          Whose pity helped give us the useless Department of Homeland Security for eternity. Yeah don't care much about their opinions at this point.

        3. boltar Silver badge

          Re: Not even wrong...

          "Tell that to 6 million of my relatives whose deaths were justified for "national security"."

          Congratulations - you win today's Godwin award. Equating genocide with anti-terrorism measures, whether you agree with them or not, is distasteful and pathetic, and your attempt at claiming inherited victimhood on the back of your ancestors' suffering is just sick.

      2. Lysenko

        Re: Not even wrong...

        Syria is a civil war, not a terrorist situation.

        9/11, besides being unique, was still about 10% of the RTA body count that year and about 30% of the firearms one. The numbers can't lie. iCar reliability matters way more than catching Jihadis and any spare resources are better directed towards developing the iGun (which can be remotely disabled and requires a PIN code to activate).

        As for the plane issue, I'm probably a bit blasé because I've been in close proximity to three terrorist bombings, and I am WAY more intimidated by my colleague's driving than I am by the tube.

        If you're asking me if I would vote to scrap most airline "security" procedures and roll the dice then the answer is "yes" ... for the same reason I walk next to the local golf course without a crash helmet on ... statistically insignificant threat.

      3. Graham Cobb

        Re: Not even wrong...

        Perhaps you'd be happy flying on a plane knowing no one had bothered to check the luggage because the 200 or so of you on there is less than the daily roadkill so who cares if you die?

        Absolutely yes. Without a doubt. Unless the stats had changed so that the risk of flying came near to the other risks -- which would happen after a while, of course, if we stopped checks which are actually useful.

        If some check has little impact on the risk numbers (for example, if it is ineffective, like much security theatre) then I have no problem going without it. A few hundred deaths a year won't worry me, until it gets to be comparable with other risks I take every day (like driving to the airport).

    3. scm2njs

      Re: Not even wrong...

      It comes down to phycology; a car accident is just that, an accident. It's also a random act: no follow-through, no direction. When it comes to terrorism the game changes; it's a violent attack on the hearts and minds of ordinary people. In this way it is also different from wars: even when wars take place in urban areas, the civilian population is usually collateral damage, or at worst simply not the target - the soldiers or fighters are.

      This is why terrorism is such an emotive subject: it's not random, it's not "Sorry, we didn't mean to, you were just in the way and I wanted to get that guy over there"; it's specific, it's targeted, and it goes after soft targets. Nightclubs, cafés, hotels and schools are where we live; psychologically we associate them with safety and fun. Attacking them has a huge impact on the people there and on the general populace, because suddenly the places where we'd go to relax are death traps. This then hits the service industry, which has knock-on effects all the way up the economic house of cards.

      Is terrorism a big threat? Probably not, but a single terrorist attack has a much bigger impact than all the car accidents you could point to. In reality a lot of plots are foiled through poor planning or slip-ups along the way, but regardless of what individual governments say about encryption, it's out there, and if you enforce draconian laws to compromise it the only people likely to be impacted are law-abiding citizens, because the bad guys will just use a different product or implement their own application to do it for them.

      1. Anonymous Coward
        Anonymous Coward

        Re: Not even wrong...

        It does come down to psychology. The fear of terrorism causes far more damage than the terrorism itself. You're still much more likely to die from lightning than from a terrorist, but if the terrorists can cause your 401(k) to halve in value like in 2001, John Q Public is ready to sacrifice his grandkids' civil rights and not think twice about doing it. People in general are lazy, stupid, greedy and selfish. Not everyone obviously, but as a collective, yes.

      2. Lysenko

        Re: Not even wrong...

        You can apply that logic to any mass murder, including the school/college shootings the USA experiences every few weeks.

        Unless you're going to confer extra validation upon murderers who happen to have religious/political motives, the proposition instantly becomes: hack anyone who is suspected of being involved in or associated with any potentially deadly violent crime. Add a sprinkling of paedos, drugs and general "think of the children" and you know where it ends up.

        We lived with the IRA for 30 odd years, some of us more closely than others. As soon as you treat murderous scum as anything more than a minor irritation - they win.

        Inventing the "Department of Jihadi Super Significance" was a bone headed idea and setting new legal precedents to underline how important they are (and how scared we are) is equally wrong.

        1. Danny 14 Silver badge

          Re: Not even wrong...

          Syria is both a civil war and a terrorist situation. ISIS control chunks, the 'regime' control chunks and various separatist movements own chunks. I think the correct term is clusterfuck.

      3. Doctor Syntax Silver badge

        Re: Not even wrong...

        "It comes down to phycology"

        What does phycology have to do with it?

  10. John 104 Silver badge

    Nice Article. Still Wrong, Trev.

    Give the US government the tool to unlock the phone and you can bet that they will illegally use it in the future from the biggest court cases all the way down to petty crime. Don't forget, this is the same government that has been hijacking your cell communications for years. All in the name of crime fighting. Don't forget, this is the same government that has been collecting all of your internet traffic, without a legal warrant, for years. Yeah, let's give them more tools to invade our privacy.

    1. Oh Matron!

      Re: Nice Article. Still Wrong, Trev.

      This. Several times over

      This appears to have been designed very well. The only way you're going to get round this is to exploit a vuln. If there's a vuln to be exploited, it will be exploited by every man and his chien.

      The FBI should go back and ask the guy for his PIN..... Oh, wait a minute....

      1. DougS Silver badge

        There's a larger problem than that

        If they do it once, the FBI will demand it again and again. Next time it won't be terrorism, it will be pedophiles. Then it will be drug lords. Then it will be tax cheats. Apple will get a lot of these requests and be required by the court to create a special version of iOS for each one that will only install on one particular phone. Maybe the court decides that this process slows things down too much, so it compels Apple to create a version of iOS that the FBI can install on ANY phone to brute force the passcode. And orders them to find a way around full passwords too, so it won't only be people who use a 4 or 6 digit passcode who are vulnerable to the FBI bypass.
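        The arithmetic behind that worry is simple. Assuming the widely reported ~80 ms per-guess cost that the on-device key derivation imposes, only longer passphrases survive once the retry limits are gone:

```python
# Rough time-to-exhaust figures for a passcode brute force, assuming the
# widely reported ~80 ms per-guess cost of the on-device key derivation.
PER_GUESS_SECONDS = 0.08

def worst_case_seconds(keyspace: int) -> float:
    """Time to try every candidate passcode at hardware KDF speed."""
    return keyspace * PER_GUESS_SECONDS

print(f"4-digit PIN:  {worst_case_seconds(10**4) / 60:.0f} minutes")
print(f"6-digit PIN:  {worst_case_seconds(10**6) / 3600:.0f} hours")
print(f"10-char alphanumeric: {worst_case_seconds(62**10) / 3.156e7:.1e} years")
```

        A 4-digit PIN falls in minutes and a 6-digit one within a day, while a long alphanumeric passphrase remains out of reach even with the protections stripped.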

        Even if it was just this one instance, what stops the UK, France, Germany, Russia, China, Iran etc. from putting the screws on Apple - "we know you can do this since you did in the US, if you don't do this for us you are banned from selling any products in this country". Maybe not much of a threat in Iran, but in China which is now Apple's largest market? Do those who think giving the FBI the ability to bypass iOS passcodes for 'probable cause' think that giving it to China or Russia is also fine? How does Apple say no to them if they are forced to say yes to the FBI?

        1. Tom 64
          Flame

          Re: There's a larger problem than that

          It'll actually get worse from there I imagine.

          It wouldn't take much for some bright, corrupt spark in the FBI or another law enforcement outfit to then sell the vulnerable version of iOS on the black market. At that point it's open season on iOS.

    2. Anonymous Coward
      Anonymous Coward

      Re: Nice Article. Still Wrong, Trev.

      I think Pott's point was that companies shouldn't be creating devices where the only thing preventing disclosure of their contents is policy. It should instead be technically impossible (or at least infeasible) to crack them.

      Apple, while I love their products, are just using this as "we defy the government for your protection" PR glossing over the fact their (older) devices aren't as secure as they make out.

  11. Joeman
    Trollface

    one million dollars..

    Cue the first publicly announced, state funded hacking contest!!

    FBI will pay $1m to anyone who can crack the iPhone... Then out of spite they will publish the source code on the web to p!ss off apple for being terrorist supporters.

    1. Danny 14 Silver badge

      Re: one million dollars..

      The FBI is not the NSA. At least the FBI took the correct route through the courts (irrespective of whether they are successful in the end). I doubt the NSA would have been so transparent.

      1. Roland6 Silver badge

        Re: one million dollars..

        At least the FBI took the correct route through the courts

        Hence why Tim Cook can talk about it...

        If you look at Facebook's statement, which some who can't read claim supports Apple, and understand the Snowden disclosures, you can see that Facebook is alluding to requests being made by authorities that it is unable to talk about...

  12. lansalot

    Doesn't read to me like there's a design flaw - FBI are asking if it's possible to do this ("make a new version of the iPhone OS"), and Apple are saying "we've been asked to do this and we're not, as it's a bad thing; it introduces a design flaw" - which by implication would suggest that it's not already there and as such it's currently a pretty watertight design.

    Trevor's argument sounds a bit like saying "if you reprogram TrueCrypt and can somehow install it, then you can have the contents of my hard disk without my password". Which I don't think sounds very likely to anyone.

    1. Danny 14 Silver badge

      The software is tied to the hardware. If Apple can duplicate the hardware EXACTLY in a VM, then the FBI get unlimited tries. This isn't anything voodoo-like, and it certainly isn't possible in every case.

      1. DougS Silver badge

        You have to copy the information currently stored on the phone into that VM (assuming they even have such a VM and wouldn't need months to develop it). The phone's hardware is designed to make that difficult; it is more than just copying the data that's in the flash. You would at minimum need to decap two chips and read the two IDs that make up the separate halves of the encryption key using an electron microscope. On newer phones than the 5c at issue here, you'd need to compromise the Secure Enclave, which is probably designed using tamper-proof methods, meaning you'd have to do this work in an unlit vacuum.

  13. Brewster's Angle Grinder Silver badge

    And in other news...

    ...a judge orders Ferrari to build a car that travels faster than the speed of light. More footage in yesterday's news.

  14. The Islander

    Concerned? Just a bit

    Is it not relevant that the outcome of this action, if successful, is exactly what privacy proponents want to avoid?

    The semantics may be correct: it is not trying to break the encryption. But if it is possible to work around it, is the implementation not flawed?

    What value the much vaunted pure maths / statistically small chance of "breaking" encryption when a means exists to subvert it?

    How long will it take before this precedent in law becomes the norm? And how will it be restricted to vendor and law enforcement?

    I concur that vendors peddling such goods need to be far more forthright in their claims.

  15. Thomas Wolf

    Why Trev Pott is wrong - a privacy advocate's view

    Trev,

    You're absolutely right that what the magistrate is asking Apple to do is not a "backdoor in the context of encryption". But your context is wrong. A special OS firmware build that disables the "slow-down after x tries" is effectively a backdoor into the device because it makes it possible for someone to get in (with a 'brute force' key). So now the government has this special build Apple gave it that will make it possible to get into any iPhone 5c - with or without warrant. Of course the US government wouldn't abuse that build. I'm sure it won't find its way into NSA's hands....CIA....whatever. Nah.

    And then there's the 'precedent' thing that you ignore: if Tim Cook develops this special version of the OS for the US government, how can he refuse similar 'legal' requests from other governments? How comfortable would you feel handing your iPhone to some border official in, say, China? He disappears briefly with your phone into some back office, then comes back smiling, telling you you should lay off the porn. On a more serious note: how safe can a dissident feel?

    The problem is, of course, that "legal" is defined by the country making the requests. And these countries don't always have the same definition that we do. It is indeed, as Tim Cook put it, a slippery slope when you begin to accede to one government.

    1. boltar Silver badge

      Re: Why Trev Pott is wrong - a privacy advocate's view

      "So now the government has this special build Apple gave it that will make it possible to get into any iPhone 5c - with or without warrant. Of course the US government wouldn't abuse that build. I'm sure it won't find its way into NSA's hands....CIA....whatever. Nah."

      So you're saying privacy on a fucking phone trumps everything else including national security?

      You give out far more personal data on a daily basis to function in modern society but you're so worried about your porn collection and texts from mistresses that you'd happily let potential useful information about terrorists be withheld? GTFU.

      Yes, mod me down teenage SJWs.

      1. This post has been deleted by its author

      2. Anonymous Coward
        Anonymous Coward

        Re: Why Trev Pott is wrong - a privacy advocate's view

        Don't worry Gramps. The big bad terrorists aren't coming for all the entitlements you voted for yourself. The world will be safe for you to leech on for at least another few decades.

      3. Thomas Wolf

        Re: Why Trev Pott is wrong - a privacy advocate's view

        You're simply wrong.

        I don't give up my social security number or that of my family "on a daily basis to function in modern society". Neither do I divulge my various account numbers. Neither do I divulge private photos. But all these things are on my iPhone. And all these things can be used to harm me financially.

        Darn right, I think privacy trumps national security - especially when the threats to national security are way overblown. The likelihood of being harmed by terrorists in the US is currently about 1000x smaller than that of being killed by some hillbilly's gun... yet we haven't passed even the smallest gun legislation in decades. Why are we so scared of a few terrorists that we're willing to give up our liberties so easily?

        1. Anonymous Coward
          Anonymous Coward

          Re: Why Trev Pott is wrong - a privacy advocate's view

          >Why are we so scared of a few terrorists that we're willing to give up our liberties so easily?

          Because the US is so right wing it often threatens its first world status? Gutting public education was a brilliant move a generation back.

      4. KeithR

        Re: Why Trev Pott is wrong - a privacy advocate's view

        Christ - you'd think terrorism was INVENTED on 9/11...

        I'm old enough to remember IRA bombs going off in the UK - bombs paid for by US sponsorship.

        So STFU about terrorism as if it only became worth worrying about once the US got a taste of it.

      5. KeithR

        Re: Why Trev Pott is wrong - a privacy advocate's view

        "So you're saying privacy on a fucking phone trumps everything else including national security?"

        I'm saying MY privacy (my RIGHT to privacy) trumps YOUR national security - or the paranoia that feeds it...

    2. Trevor_Pott Gold badge

      Re: Why Trev Pott is wrong - a privacy advocate's view

      "A special OS firmware build that disables the "slow-down after x tries" is effectively a backdoor into the device because it makes it possible for someone to get in (with a 'brute force' key). So now the government has this special build Apple gave it that will make it possible to get into any iPhone 5c - with or without warrant."

      Yes. If it works, if Apple can deliver then we know the iPhone 5c is vulnerable and never lived up to the claims that it would protect your privacy in the first place.

      Truth is more important than a comforting lie. Even when that truth is the bitter realization that the trust you placed in a company's advertising was misplaced.

      If the company can compromise the phone, then the phone is flawed, period.

      We need to know which devices are secure, not hide behind some technicality of law to protect us. The bad guys don't honour the law.

      1. DougS Silver badge

        Re: Why Trev Pott is wrong - a privacy advocate's view

        A flaw that only Apple can take advantage of (since only they can create iOS updates signed with the private key held only by Apple) is not the same thing as a flaw that a random hacker or even a state actor can take advantage of. Unless they can break into Apple and steal the key used to sign iOS updates. Hopefully Apple restricts access to it to a few people and keeps it on an air-gapped system, but obviously I have no knowledge of their procedures.

        1. Charles 9 Silver badge

          Re: Why Trev Pott is wrong - a privacy advocate's view

          "Unless they can break into Apple and steal the key used to sign iOS updates. Hopefully Apple restricts access to that to a few people, and keeps it on an air gapped system, but obviously I have no knowledge of their procedures."

          If the bad guys want something badly enough, they'll hire insiders. Or find weaknesses. Remember, at least one of Sony's PS3 private keys got compromised and more and more malware is being signed with genuine keys that were likely stolen (so they not only can pass authentication checks but also can't be voided without collateral damage), so it's not outside the realm of reality.

        2. Trevor_Pott Gold badge

          Re: Why Trev Pott is wrong - a privacy advocate's view

          Except that isn't the case. Apple isn't an island and this idea that corporations exist independently of the societies in which they operate is absurd.

          If Apple (the corporation) or Apple (the individuals who make up the corporation) can be compelled by any legal system - or by a man in a mask with a $5 wrench - to create a tool/custom firmware/what-have-you that can "hack" a phone, that phone is not secure. Anything else is handwaving away responsibility and hiding behind legal tricks instead of producing a secure device.

          I'm sorry, but Apple is beholden to the laws of the United States of America and - to a lesser extent - those other nations in which it operates. If a legal means exists to force them to hack the phone then the only difference from there being a big red button that says "hack me" is the number of malicious actors that can exploit that flaw. (Namely governments and those who know which people to hit with a wrench.)

          If such a flaw exists, we deserve to know this. It should be made very clear and very public. Knowing exactly which classes of malicious actors can impinge our civil liberties and violate our privacy is part of being informed about the devices we purchase. Given that everything these days is "licensed" and not "purchased", I'd go so far as to say it is part of informed consent for participating in a contract with a service provider.

          This judge looks set to provide us, the consumer, with critical information. I sincerely hope Apple are unable to hide behind legal trickery and handwave away responsibility for adequately securing their devices.

      2. Roland6 Silver badge

        Re: Why Trev Pott is wrong - a privacy advocate's view

        Yes. If it works, if Apple can deliver then we know the iPhone 5c is vulnerable and never lived up to the claims that it would protect your privacy in the first place.

        Sorry, ANY device, including the latest iPhones, is vulnerable if it permits the automatic installation of vendor-signed updates. You only need to look at users' experience with recent Windows updates to appreciate this simple fact.

        1. Trevor_Pott Gold badge

          Re: Why Trev Pott is wrong - a privacy advocate's view

          "ANY device including the latest iPhones is vulnerable, if it permits the automatic installation of vendor signed updates"

          EXACTLY.

          EX-FUCKING-ACTLY.

          Nail on the head. Vendor update mechanisms are a security vulnerability and all of our devices need to be designed with the idea that the vendor can and will eventually be compromised.

          No device should ever install an update against the explicit will of the device owner. Not Apple, Not Microsoft, not Google. If you choose to enable automatic updates, that's your bellyache. But if a device is configured for security - including setting it to manual updating - then that device should not be able to be updated forcibly by any party! Any attempt to do so should brick the device, period!

          Furthermore, no device should set about updating while locked. Ever. This prevents devices which are set to auto-update from being pwned by vendors after being physically removed from their owners. Devices should ONLY set about updates when the user can be informed what is going on and has the opportunity to cancel the updates before they proceed.

          The idea that vendors are somehow trustworthy is bogus. Vendors are beholden to the governments of the nations in which they operate and thus must be considered to be as malicious an actor as the most corrupt members of the law enforcement agencies or courts to which that vendor is subject.

          We, as customers, need to understand and accept this. Vendors need to understand and accept this as well, and design their products to protect us (the customer) from them (the vendor and those who can command the vendor).

          If you don't like that, find another reality, but this is the cold hard truth of the one we live in today.
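          As a minimal sketch of the policy being argued for here (all names hypothetical, not any vendor's actual API):

```python
from dataclasses import dataclass

# Hypothetical sketch of the update policy argued for above: no update ever
# proceeds on a locked device, or without the owner's explicit consent.
@dataclass
class Device:
    locked: bool
    auto_update_enabled: bool

def may_install_update(device: Device, owner_approved: bool) -> bool:
    if device.locked:
        return False                       # never update a locked device
    if owner_approved:
        return True                        # explicit consent always suffices
    return device.auto_update_enabled      # otherwise only if the owner opted in

# A vendor-pushed update to a locked, manually-updated device is refused:
print(may_install_update(Device(locked=True, auto_update_enabled=False), False))
```

          Under such a rule, a vendor-signed image pushed at a locked or manually-updated device simply never runs, whoever signed it.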

  16. Anonymous Coward
    Anonymous Coward

    Debit card numbers

    just to confuse the debate, every new debit/credit card I've had for the past 5 years has had a different number, same PIN ....

  17. cd

    How's the weather at Sunnybrook Farm this morning?

    1. Fr. Ted Crilly
      Black Helicopters

      Don't you mean how's the weather at Tarnover or Electric Skillet?

  18. msknight

    The law says you shall not kill, with mitigation in self-defence. You kill anyway and you get sent to jail. It used to be the death sentence, but it was public pressure that reduced that.

    The law says, you shall break this encryption... and if you don't, you should expect jail. If this discussion causes the law to change its process, then all well and good... but if it doesn't, then I would expect enforcement officers to come calling for Cook.

    Corporates don't have the entitlement to be above the law. Heck, they've spent enough money to bend it their way, and we've let them do it, and trample all over it, until now. Corporates aren't on our side. Ever. They're on whichever side gets money from our pockets into theirs... and Cook (and others) are playing on our distrust of government snooping in order to play a blinder here, and fool us into believing that their products are secure from government snooping; because, as people have already pointed out, there are flaws in the system through which, given time, the spooks could get the data out anyway.

    Sorry, but I'm not buying it. I don't have my life on my mobile. I don't use it for banking. I very rarely use it for securely accessing any web sites. If spooks are transfixed on my phone, then good luck to them.

    My privacy is guarded by me... and I don't entrust that guarding to a piece of electronics... because it's all flawed.

    1. John H Woods Silver badge

      "I don't have my life on my mobile. I don't use it for banking. I very rarely use it for securely accessing any web sites. If spooks are transfixed on my phone, then good luck to them." -- msknight

      is approximately equivalent to

      "I have nothing to say, so I don't care about freedom of speech"

    2. Thomas Wolf

      "The law says, you shall break this encryption..." - except that it doesn't. Your sentence should say: "The government says..." - and then, of course, your position has less support.

      "Corporations aren't on our side. Ever." Well, that's a nice *opinion* to have, except that corporations are on our side all the time - after all, they produce the products & services we want - I'd count that as being 'on our side'. As a matter of fact, show me a single corporation that stayed in business for very long by *not* being on our side? Governments, on the other hand, do stuff to their population that isn't in that population's best interest all the time - e.g. the Snowden disclosures.

      "Cook is playing on our distrust of government snooping in order...." It's obvious from the Snowden disclosures and countless historical incidents that, in fact, we cannot trust government. Not only do governments intentionally violate citizens' rights; there have been countless cases of sheer ineptitude that violated our rights. To wit, the recent hack of millions of detailed background-check records at the US Office of Personnel Management. Those idiots didn't intentionally give up those highly sensitive documents; they were just inept enough to leave them on a network for someone to steal! So when the FBI says to Apple "this is just a one-time executable that will get thrown away when we're done with it" I just have to laugh. Yes, perhaps Tim Cook is "playing on our distrust" to highlight the security of Apple devices and sell a few more units. That doesn't mean that distrust is misplaced.

      "Sorry, but I'm not buying it. I don't have my life on my mobile..." Good for you - the rest of the world isn't living your hermit life.

  19. LDS Silver badge

    If you can't trust your government and the law, you're f****d anyway...

    ... unless you change it.

    Face it, if you can't trust the law and those who should enforce it, then everything falls apart. In a modern democracy, laws are what protect citizens both from criminals and from the authoritarian wishes of those in command. If you let them fail to work as they should, you're really f*****d.

    Given the level of the presidential candidates, it looks to me like the USA has far deeper problems than a locked iPhone. And that's probably just the tip of a far larger pile of s**t. If the election becomes "X-President", or "Next American President", or "The Clintons", for the silly entertainment of silly TV and YouTube watchers, and the greed of media for the cash they see coming every four years, you have a far bigger problem than a locked mobe.

    If you fear any judge prosecuting true criminals (in order to protect you) because you fear they could also prosecute you on a whim, then you - we - have a far bigger problem than a locked phone.

    Citizens should act far more broadly than trying to protect smartphones alone. If you vote in a government you can't trust, it will put its hands in your data (and let its big friendly companies do so as well) - you will never be able to protect your life from an oppressive government, and you'll never be able to protect it without the proper laws properly enforced.

    People are letting power slip out of their hands because they are too focused on which iGadget to buy next. They - we - are going to pay dearly for it in the years to come.

    1. Anonymous Coward
      Anonymous Coward

      Re: If you can't trust your government and the law, you're f****d anyway...

      POTUS is largely a figurehead for everything but starting wars (and fsck Congress for giving away that power). Executive actions create a lot more headlines than actual policy change.

  20. Anonymous Coward
    Anonymous Coward

    >"A back door in the context of encryption would be either a key escrow system or a "master key" system that would allow easy access for law enforcement into any Apple product encrypted with the back-doored system."

    No, it's a door that isn't the front door. Trying to narrowly define backdoors like that is just a word game.

    Ultimately, who the bombers communicated with is already known from the CDR billing records held by the telephone provider (and by the NSA). So this case is more about establishing the right to require a backdoor than about actual investigative work.

    In Windows 10, Microsoft sent the disk encryption keys to Microsoft servers. This seems to stem from an FBI demand from 2012:

    http://mashable.com/2013/09/11/fbi-microsoft-bitlocker-backdoor/#muXhWnS6w8qf

    But it means that China can demand those keys, and Russia and every other tinpot dictator can demand the keys from Microsoft. So you've made us all less safe by doing that.

    If Apple are required to install backdoors in the US, then Apple will be required to do so everywhere. Recall Blackberry and its "lawful intercept" capability? You'll do the same damage to Apple. It also establishes the precedent that a company can be forced to backdoor its own product.

    > "Its devices should not have encryption vulnerable to attack. If these devices are vulnerable to attack, then the judge is absolute within their rights to call on Apple to break that encryption."

    Now this statement I agree with. If it's already backdoored (by design or negligence) then the judge can force use of that. But is it?

  21. Ilmarinen
    Big Brother

    Absolutely Not

    "the judge is absolute within their rights to call on Apple to break that encryption"

    Absolutely Not. The court is there to enforce the law. What is being suggested is that the court can force a third party to assist it in whatever way it deems necessary. And the third party is not allowed to say "sorry, I don't want to help you", nor to refuse to jump through any hoops demanded.

    Instead of being left to carry on their own business, and despite having broken no law, they must do whatever the court directs.

    Not good.

    1. Warm Braw Silver badge

      Re: Absolutely Not

      Quite.

      If there were an active case against the owner of the phone and its outcome depended on the contents of the phone, there might, under certain circumstances, be scope for the court to require some basic, lawful assistance in acquiring those contents.

      However, as far as I can tell:

      1/ There is no case; there is merely an ongoing investigation into the activities of a dead man

      2/ The court is essentially requiring Apple to act as an extension of a law enforcement agency at its own considerable cost

      3/ The precedent would be that onerous conditions could be placed on anyone by a court to assist in a criminal investigation - this is effectively conscripting a posse.

      4/ There is a strong chance the court does not in fact have the legal authority to make the order it has made.

      Apart from that, I'm sure it's just dandy.

      1. Thomas Wolf

        Re: Absolutely Not

        And what gives them "reasonable cause" anyway? The perpetrator has already been prosecuted (he was killed) and there's no evidence, from what I've read, that the information on this phone would help prevent another crime/terror plot. In other words, the judge is asking Apple to help the FBI in a fishing expedition.

  22. Eugene Crosser

    Not exactly a "design flaw"

    > What appears to be involved is a design flaw.

    Not so much a design flaw, as a hardware deficiency of an older iPhone model, i.e. lack of "Secure enclave" in the model in question. This guy provides a very plausible analysis.

  23. BurnT'offering

    Nice logic

    "If these devices are vulnerable to attack, then the judge is absolute [sic] within their rights to call on Apple to break that encryption."

    So, Apple can only resist the judge's call if they can prove these devices are invulnerable. Which, of course, cannot be proved. So, they are on the hook to exploit a hypothetical vulnerability that may not exist, or may only become apparent much later.

    A lawyer might argue that: "Only if the judge can demonstrate a vulnerability that opens these devices to attack would the judge then be within their rights to call on Apple to exploit that vulnerability."

    1. Trevor_Pott Gold badge

      Re: Nice logic

      The law still places some emphasis on an individual's word. You get relevant experts on the phone in question to come in and say under oath that they know of no means to attack it. If the judge can find no one who knows how to pwn it, then the device is presumed invulnerable for the purposes of that case and off they go.

      If the experts refuse to answer whether or not the phone can be pwned, or the experts are caught lying under oath, they are in Deep Shit. They'll get thrown in the clink until they are no longer in contempt of court.

      This isn't about proving a negative. It relies on the assumption that people under oath tell the truth, especially when there is more than one of them. Without that assumption our entire legal system falls apart.

      1. BurnT'offering

        Re: Nice logic

        The starting position is that Apple says it's impossible. The judge knows this. She has said, do it anyway. So she already assumes Apple is either lying or has a way to achieve the impossible. You assert that she will placidly accept expert testimony that there are no known ways to hack the phone. However, she has already taken a step beyond this and accepted the FBI's argument that it can be done. Apple must now show that doing so would be unduly burdensome, the definition of that being up to the judge. This is equivalent to having to prove one's own innocence. Given that subjectivity, the verdict is a foregone conclusion, as is the certainty that Apple will appeal all the way up the system.

        1. Trevor_Pott Gold badge

          Re: Nice logic

          Apple has said it shouldn't be done. I see nowhere they said it couldn't be done. Apple have said that there are no conventional/official means available to hack the phone. That does not mean that the method suggested by the FBI - create a compromised firmware and upload it to the device, which can apparently be done without needing to enter the PIN - cannot be used. It is simply that this is unconventional and not an official method of recovering the data.

          Apple would indeed have to be able to demonstrate why the FBI is wrong here. That's what courts are for; to allow experts to present evidence and to make decisions based on that evidence. Let's ponder some scenarios.

          If Apple had stated under oath that there is no way they know of to hack the phone, and it is later proven they did know of a way, they go to jail.

          If Apple had stated under oath that there is no way they know of to hack the phone, and someone comes up with a way that requires Apple's assistance, they don't go to jail, but they are compelled to help law enforcement do their jobs.

          If Apple had stated under oath that there is no way they know of to hack the phone, and nobody can show how the phone might be hacked, then there is no reason for them to go to jail. They told the truth and there is no evidence to show they are lying.

          Why is that any different than you getting on the stand and saying "no, your honour, I was not at the pub that night between 11pm and 3am"?

          If you claim this and you're proven to be lying (say by video evidence) you go to jail.

          If you claim this and nobody can prove that you were there nothing happens.

          Apple is not required to prove their innocence. They are asked to cooperate in an investigation. A means by which they can do so has been suggested. If there is some reason they cannot comply with that means they have to demonstrate why that is.

          What is wrong with that? How is that morally or legally wrong?

  24. ShortLegs

    From what I can gather, Apple's stance is not so much motivated by the warrant per se as by the Act the FBI used to obtain it: the All Writs Act, circa 1789. Arguably, this Act was never intended to have this scope, at the time limited to granting power of entry to named officials without warrant. It was one of the contributory factors that led to the colonists seeking independence.

    To those posters who argue that security trumps privacy: does it? Did we need to forego privacy during the 70s and 80s mainland PIRA bombing campaigns? The state has consistently shown that it cannot be trusted - witness the fabrication of evidence in numerous trials, the withholding of evidence, outright lying to Parliament, attempts at removing democratic checks and balances such as the House of Lords and the power of the Legislature to counter the Executive, and the muzzling of the free press - so why should we forego a basic freedom, one enshrined in the ECHR?

    1. KeithR

      " It was one of the contributory factors that led to the colonists seeking independence."

      Oh.

      The.

      Fucking.

      Irony...

  25. chris 17 Bronze badge

    Oi Trev, ave a read of this

    https://www.apple.com/business/docs/iOS_Security_Guide.pdf

    then rewrite the article including what you've learnt about the lengths Apple has gone to in order to prevent this kind of state bullying to access users' data

    TL;DR: the only possible way in would be for Apple to somehow replace the iOS on that device with a compromised version that permits brute-forcing of the passcode - in other words, to develop a copy of the OS with a backdoor enabling the brute force. The version of iOS currently on that phone is not susceptible to such an attack in a timely manner: it would take at least a year even if the default option to erase all data after 10 failed attempts had been deliberately disabled by the phone's owner before locking; otherwise the data would be unrecoverable after 10 attempts.
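    For a sense of scale, here's some back-of-envelope arithmetic in Python - a sketch only, assuming the roughly 80ms-per-attempt key-derivation cost cited in Apple's guide, and assuming the 10-try erase limit and escalating delays are somehow out of the picture:

```python
# Rough brute-force timing for numeric passcodes, assuming ~80 ms per
# attempt (the key-derivation cost figure in Apple's iOS Security Guide)
# and assuming the erase-after-10 limit and escalating delays are bypassed.
ATTEMPT_SECONDS = 0.08

def worst_case_seconds(digits: int) -> float:
    """Worst-case time to try every numeric passcode of the given length."""
    return (10 ** digits) * ATTEMPT_SECONDS

for digits in (4, 6):
    secs = worst_case_seconds(digits)
    print(f"{digits}-digit passcode: up to {secs / 3600:.1f} hours")
```

    Which suggests a typical 4-digit PIN falls in minutes once the retry protections are gone - it's the erase-after-10 and delay mechanisms, not the passcode length, doing the heavy lifting; the year-plus figures apply to long alphanumeric passcodes.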

    1. Trevor_Pott Gold badge

      If the phone can have a compromised firmware uploaded without the phone owner's permission it is flawed. At which point, we should all know this so that we can bin the units and seek adequately protected units from someone else.

      If the phone is not susceptible to any attacks that Apple's own experts are aware of, then they get to take the stand and claim under oath "they know of no means to meet the judge's order" and we all get to cheer in the streets.

      Either they can do what the judge wants and thus the phone is flawed, or they can't and the phone lives up to the hype.

      If the phones are designed properly then Apple should not be able to pwn one of their phones to comply with this sort of request. Quite frankly, that is the only adequate protection we will ever have. Hiding behind a technicality of the law will never be good enough.

      1. steeple

        Sorry Trevor, it's not "flawed" so much as lacking in some functionality. A car built before ABS did not have flawed brakes. The patch for this is called the Secure Enclave, by the way. Being a hardware fix, however, it is difficult to roll out over the internet...

        Also, "hiding behind a technicality of the law" is what keeps many innocent people out of prison.Holding the technicalities of the law in contempt is simply silly.

  26. This post has been deleted by its author

  27. Boris the Cockroach Silver badge
    Big Brother

    I read this slightly differently

    An iApple device has data on it that law enforcement wish to recover.

    They go to court and get a judge to say "This phone needs to be opened; get the manufacturer to help you open it".

    Which is entirely different to the usual way the three-letter agencies work, to wit: installing a bunch of malware on everyone's phones without permission in order to data-slurp the entire interweb in the hope of finding the one fool who types "nuclear, jihad, tomorrow, bomb, London, 9.15am"

    Be right back.. someone is at the door.

  28. Neil Alexander

    Re: "Either it is possible to load a compromised firmware into the phone"

    When you have ultimate freedom to cryptographically sign whatever firmware you want for the hardware in question, of course it's possible.

    1. Trevor_Pott Gold badge

      Re: "Either it is possible to load a compromised firmware into the phone"

      Then we have a problem, as the phone is vulnerable to attack by the bad guys. The devices need to have a means to be made immune to this attack vector. It's as simple as that.

      1. chris 17 Bronze badge

        Re: "Either it is possible to load a compromised firmware into the phone"

        @ Trevor_Pott

        Only the 5c is vulnerable to this attack; later models are currently believed to be immune, as a dedicated chip controls the number and rate of attempts.

  29. dmdev

    True, Apple can comply now and fix the flaw in future releases - as long as they do actually fix the flaw.

    However, the fundamental problem is simple for law enforcement to solve: don't shoot to kill suspects/criminals, and you have legal means at your disposal to unlock the phone.

  30. Asterix the Gaul

    Since when has any 'judge' been empowered to make the law, when it's their job only to judge whether or not, as the case may be, someone has broken the law?

    This 'judge' cannot be acting within the law by attempting to coerce a felonious act by any body, whether a person or other body.

    In trying to do so on behalf of a government agency, he is both acting as their proxy in procuring an attempted 'hack' and setting himself up as judge & jury on matters outside the case in hand.

    That the judge believes it's possible to 'hack' the device in the first place flags up potential weaknesses in the hardware, the limits of government agencies' abilities to overcome such 'weaknesses', and their willingness to use any method to attempt such a task.

    It is incumbent on ALL citizens to deny ALL governments the potential to breach their security/privacy.

    The UK government's absolute disregard for shielding our privacy from their prying is a double-edged sword that will destroy them.

    Every citizen who is the victim of such acts should resort to the courts, even when the legal system, as in the UK, is now fundamentally flawed through the abolition of the House of Lords as the highest court in the land, replaced by a 'political' Commissars' court, AKA the 'supreme court' of idiots.

    1. Doctor Syntax Silver badge

      "Since when has any 'judge' been enpowered to make the law,"

      Actually, they do that regularly. It's the basis of Common Law in England and all other countries which follow that principle. It's statute law they don't make.

      "when it's their job to judge 'only',whether or not the case may be,if someone has broken the law."

      Again, only partially correct. At the magistrates' court level (or whatever the equivalent may be in other jurisdictions), yes. And, as we're dealing with terrorism, in the "Diplock" courts in N Ireland*. But in jury trials it's the jury's job; the judge's job is to explain the law to the jury.

      *In those courts the judge had to do something no jury is called on to do: explain the reasoning by which they arrived at the verdict.

  31. Anonymous Coward
    Anonymous Coward

    What really makes me sad is not that they want these privacy-invading back doors, it's that even when they have all the information they want they still can't catch the bad guys. Look at how much data the US and UK have slurped over the last 10 to 15 years, and what have they managed to do with it? It's certainly possible they may have stopped things we don't know about, but my guess would be that if they had managed to stop something significant they'd be shouting it from the rooftops.

    I struggle to think of any significant crime being prevented by snooping. We seem to have caught a few idiots who were probably as much a danger to themselves as to anyone else, but it's not clear that that wasn't just regular police work. We certainly haven't managed to stop events like the shootings in France.

  32. stucs201

    Are there any precedents with other forms of security?

    Take, for example, suspected evidence locked in a safe. Would the safe manufacturer be required to assist in opening it without destroying the contents? Safes are surely old enough that this has occurred in the past, and surely the same rules should apply to breaking electronic rather than physical security (I'll admit to ignorance as to what those rules would be).

    1. Charles 9 Silver badge

      Re: Are there any precedents with other forms of security?

      That's why they employ safe crackers. Anyway, the analogy is flawed. More accurate would be to say they're trying to retrieve the contents of a booby-trapped safe rigged to blow if it's opened any way other than with the combination. The only problem is that the only person who knows the combination is dead, and the booby-trap has a fail-deadly vigilance check: if it's not opened within a few days it blows itself up. And yes, if I recall, it IS possible to build a one-way fail-deadly mechanism where the only way it'll resolve is by exploding. One such device was set in a casino a few decades back.

  33. Florida1920
    Black Helicopters

    An easy way to break encryption?

    If they have the encrypted text and the unencrypted plain text, do they not then have the key to reading all similarly encrypted texts?

    Anyway, I think the Feds are only using the horror of the San Bernardino shootings to mask an invasion of civil rights, the way 9/11 was magically transformed into a reason to invade Iraq, etc. Terrorists come and go, but repressive governments are harder to shake off.

    1. Charles 9 Silver badge

      Re: An easy way to break encryption?

      "If they have the encrypted text and the unencrypted plain text, do they not then have the key to reading all similarly encrypted texts?"

      Not necessarily. What you describe is a form of Known Plaintext Attack:

      - Given X and X', find Y such that E(X, Y) = X'

      A good cipher tries to make that problem difficult to solve.
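      A toy sketch of the point, in Python (illustrative only - the XOR "cipher" below is deliberately weak and stands in for no real scheme): against a weak cipher, one known plaintext/ciphertext pair hands you the key outright, whereas against a well-designed cipher the generic known-plaintext attack is still a search over the whole key space.

```python
# Toy cipher: XOR every byte with a single-byte key. A known-plaintext
# attack recovers the key from a single (plaintext, ciphertext) byte pair.
def xor_cipher(data: bytes, key: int) -> bytes:
    return bytes(b ^ key for b in data)

plaintext = b"attack at dawn"
ciphertext = xor_cipher(plaintext, 0x5A)

# KPA against the toy cipher: the key falls straight out.
recovered = plaintext[0] ^ ciphertext[0]
print(f"recovered key: {recovered:#x}")

# Against AES-256 no comparable shortcut is known: the generic
# known-plaintext attack amounts to searching the entire key space.
print(f"AES-256 key space: 2**256 = {2 ** 256:.2e} keys")
```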

  34. Matt Bryant Silver badge
    Meh

    Yawn.

    Article filed under Correspondent for Paranoid Conspiracy Theories & Hating Amerikkka.

    All Cook is really worried about is having to admit Apple can defeat their own security pretty easily. The ruling has exposed the flaw, now every secret squirrel organisation and black hat on the planet will be racing to find a way to exploit it. Cook's only interest is in trying to stop another nail in the iPhone's slowing sales (http://www.theregister.co.uk/2013/07/18/peak_apple_iphone_sales_slow_down_across_the_globe/).

  35. Bill Stewart

    FBI's been trying to get this for a while

    This didn't start with the San Bernardino shooting - the FBI's been running court cases for a year or more trying to force Apple to do the same thing in drug cases. If they succeed, they'll end up with a tool that lets them inspect anybody's iPhone, without needing warrants, as long as they've got the phone.

    And cops have been getting away with confiscating smartphones from people they stop, also often without warrants, and they've especially been doing this in protest arrests, though at least they're starting to get some pushback from judges.

  36. Anonymous Coward
    Anonymous Coward

    Apple's cooperation

    Certainly there's no chance that Apple could make an error if it indeed was coerced into attempting to provide some assistance in whatever form. Where would that leave them?

    Spin up the conspiracies.

  37. Anonymous Coward
    Anonymous Coward

    USA and the Americans and their governments: damned if you do and damned if you don't -

    Freedom and privacy rights come with privileges and responsibilities. The US govt is trying to walk this tightrope of protecting its people without risking getting labeled anti-privacy. Tech giants such as Apple, Facebook and Google enjoy enormous profits from the public - the very same public that the US govt is responsible for protecting.

    I would subpoena the leaders of these rich tech giants and impose on them a daily fine of millions of dollars if they refuse to cooperate on the overall goal here, which is to protect the NATIONAL INTEREST OF THE USA. The profits and tax-sheltering of these tech giants come second to national security. Otherwise, the public should boycott them. Period.

    1. Doctor Syntax Silver badge

      "the overall goal here wihch is protect the NATIONAL INTEREST OF THE USA"

      And what, exactly is that? Is it the same today as it was yesterday? Will it be the same tomorrow? Or after a change of government?

  38. TaabuTheCat

    What's next?

    Apple claims they can't do it - technically impossible. Judge says, "Hmmm, these really smart guys (much smarter than your guys) from the NSA think you must have missed something, and they'd like to have a crack at it. Mind sending over all the source code, chip lithography, design docs, and a few of your best engineers to assist? On second thought, can you just set up an office for them there at HQ? This may take a while."

  39. JeffyPoooh Silver badge
    Pint

    You only have to spend an evening on CCC.de media site...

    If you watch the Chaos Computer Club's presentations / educational hacking and cracking videos for an evening, your faith that Apple's (or anyone else's) encryption is secure will be badly shaken.

    The hackers and crackers have better imaginations than you or I.

  40. Bota
    FAIL

    Why Tim Cook is wrong: A total shills view

    FTFY

  41. This post has been deleted by its author

  42. John Savard Silver badge

    Not Concerned Here

    I don't consider it a 'design flaw' that the limits on entering PINs into phones are only enforced by software - which is on chips that can be removed from the phone and replaced by chips programmed differently. An iPhone is not a piece of military communications equipment that may be captured in hostile territory.

    Since the authorities have obtained physical possession of the phone in question in a legal manner, however, I do not see the judge's order as a threat to privacy. It is a threat when a backdoor to encryption lets the authorities eavesdrop on people who have no way to know this is happening - but when proper procedures have led to a computer being seized, that is a very different matter.

    1. Charles 9 Silver badge

      Re: Not Concerned Here

      The problem is that ANY means found to get around the lock would be considered worth more than its weight in Bitcoin. Miscreants will be dying to leak this knowledge out and make it work in the general case.

      1. Roland6 Silver badge

        Re: Not Concerned Here

        The problem is that ANY means found to get around the lock would be considered worth more than its weight in Bitcoin. Miscreants will be dying to leak this knowledge out and make it work in the general case.

        There is sufficient information out there for 'miscreants' to investigate. The only real hurdle would seem to be cracking Apple's code signing certificate.

        As for the general case, well, given that iOS updates are already built for the general case (it doesn't make sense for Apple to do otherwise), this isn't a hurdle. However, I expect that using a process designed for the general case to produce a build that can only be used on a specific device is a problem Apple are grappling with - basically, they realise that the simplest way to comply with the FBI's request is to produce something that can be used in general...

  43. Anonymous Coward
    Anonymous Coward

    They can search your property based on a valid probable-cause warrant as much as they want - as in, say, Making a Murderer - to solve a crime. But for a third party with the wherewithal to assist in solving a crime (in this case an issue of national security) to refuse is far beyond obstruction of justice. Apple should be made to pay for their refusal. The government is not asking for a back door to be built into all phones, even though they would like one. They are asking for one mass murderer's phone to be cracked.

    This is a mockery of government power.

    1. Anonymous Coward
      Anonymous Coward

      Apple counters it's the Snowball Effect. If they make a way to break into ONE phone, they necessarily make a way to break into ALL of them, thus making ANY attempt to invoke a reasonable search of their software a breach of the Fourth Amendment.

  44. strum

    Imagine

    Imagine that this phone was in the hands of Mr Farook's friends or family. They'd like to know what's in it, so they hire a very clever cracker to get into it. And, what do you know, there's evidence exculpating Mr Farook (or evidence showing the FBI/CIA/NSA sponsored him).

    Do you think the FBI would accept that evidence? Or would they insist that a cracked phone could not be entered as evidence, since no one could be sure what was added or subtracted during the crack?

    So, why would we accept any evidence from a phone cracked by Apple/FBI?

  45. Anonymous Coward
    Anonymous Coward

    So a solution is for Apple to patch the target phone as requested, which shuts the FBI up, then immediately issue an update for the 5c which renders it unhackable by the method used on the target phone, and preferably by any method.

  46. InfoSecuriytMaster

    Apple is not being asked to install or utilize a backdoor. Apple is being forced to turn iOS into a backdoor - or rather, to remove security from the iPhone. An upgrade becomes a degrade. Now upgrades and patches are actually the FBI's trojan, so no one can trust Apple's upgrades/patches. And next, MS's and Google's. So now we, because of the FBI, are tossed back to 1984 technologically - and politically.

  47. Anonymous Coward
    Anonymous Coward

    Authorities have every right to unlock the phone of a criminal

    Apple won't win this one and they shouldn't.

    1. Charles 9 Silver badge

      Re: Authorities have every right to unlock the phone of a criminal

      Not if it causes collateral damage. If the means to unlock the criminal's phone necessarily unlocks everyone else's phones (and the design of it may well make this part and parcel), then you have the Fourth Amendment (search and seizure) and the Sixth (presumption of innocence) to contend with.

  48. gtarthur

    It was not a personal phone

    In the rush to marshal all the forces of good and evil, the main combatants have overlooked a glaring mistake by the terrorist's employer. The local government agency issued him a "company" cell phone without a mobile device management (MDM) agent installed that would have allowed the local government to take control of the phone. He didn't buy or lease this cell phone; it was provided to him, ostensibly, as part of the responsibilities of his job. In many agencies and companies, cell phones are merely perks, and the resulting lack of proper administration is obvious in this case. The use of an MDM agent wouldn't have broken Apple's excellent encryption regime, nor violated the privacy of the phone's "owner" - because the owner was not the user, it was the local government. Once again, lack of foresight and planning has created a social dilemma that could easily have been prevented. As for privacy versus security, I remind everyone that each of us has primary responsibility for our own privacy, and the first lesson is to keep it to yourself. Your phone is not an appendage.

  49. PghMike

    Apple is mostly right

    You portray Apple's position as black and white, but it really isn't. Apple is being asked to spend their own money breaking into their own OS. No matter what they do, there'll always be *some* attack that can work against even future phones, even if it requires taking the phone apart atom-by-atom.

    Apple is saying "No, we won't do this," and wants to stop now, even though the costs are probably not prohibitive today for a single iPhone 5C.

    In other words, Apple doesn't want to be ordered to spend their own money to subvert their own security. It will *never* be *impossible* for them to get keys out of a phone. But it will be increasingly expensive, time consuming and likely have an increasing likelihood that an error will accidentally destroy the data on the phone.

    1. Charles 9 Silver badge

      Re: Apple is mostly right

      "You portray Apple's position as black and white, but it really isn't. Apple is being asked to spend their own money breaking into their own OS. No matter what they do, there'll always be *some* attack that can work against even future phones, even if it requires taking the phone apart atom-by-atom."

      Even if there's a self-destruct mechanism? There IS such a thing as a one-way mechanism, meaning one CAN physically render a package impossible to open without destroying the contents first.

  50. This post has been deleted by its author

    1. Charles 9 Silver badge

      You can't do a diff against encrypted contents, since with proper encryption a change of a single bit will propagate throughout the entire image.
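      A quick stdlib illustration of that propagation, using SHA-256 as a stand-in (demonstrating real disk encryption would need a third-party crypto library, but the avalanche behaviour - one flipped input bit changing roughly half the output bits - is analogous):

```python
import hashlib

def bit_diff(a: bytes, b: bytes) -> int:
    """Count the bits that differ between two equal-length byte strings."""
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

original = b"disk image contents, pretend this is a whole sector" * 8
tampered = bytearray(original)
tampered[0] ^= 0x01                      # flip a single bit

d1 = hashlib.sha256(original).digest()
d2 = hashlib.sha256(bytes(tampered)).digest()

# Roughly half of the 256 output bits change for a one-bit input change,
# so a byte-for-byte diff tells you nothing about what was edited.
print(f"{bit_diff(d1, d2)} of {len(d1) * 8} output bits changed")
```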

  51. Rasslin ' in the mud

    What will Tim Cook & Co. do when the terrorists shoot up Cupertino?

    One has to wonder how firm Timmy's stance will be when it's Apple's own employees and facilities which have been attacked and there is an iThing left at the scene which may expose more information about the perpetrators. In other words, I suspect the strength of his morality is relative to how close the bullets fly to his own sorry butt.

    1. Anonymous Coward
      Anonymous Coward

      Re: What will Tim Cook & Co. do when the terrorists shoot up Cupertino?

      They could just as easily have done their coordinating in person in dark, smoke-filled dance clubs where it's hard to read lips or use a mike, in telephone conversations about football teams or their families (so tapping won't reveal anything), or through classified ads concerning specific things for sale. So long as you can solve the issue of First Contact, you can establish a means of having conversations that are damn hard to pick up, especially if the law isn't yet tipped off. Think about how the 9/11 people worked: they KNEW about the law's ability to bug, so they went low-tech to sneak under it all.

  52. Anonymous Coward
    Anonymous Coward

    Protection of my privacy by corporations in this country is a farce. The banks bend over backwards to hand over all my records. So do all other financial institutions. The credit reporting companies are collecting all kinds of data about me. Merchandisers are profiling me all the time. Google is tailoring ads to push my way, as does Facebook. They all have horrendously long Terms of Agreement that no one can afford to read. Privacy settings are so complex that you have to be a computer whiz to get them right, and you will still miss some deeply buried setting. So I do not understand who Apple is trying to protect - me, criminals, or their business brand. Not cooperating with the Government to help fight crime is totally un-American. Apple would not be what it is today without the Government providing various kinds of security and protection.

    Apple is wrong and needs to learn, I hope not the hard way, that privacy without security does not mean a thing. If the US government were to become a police state, their iPhone could not save the nation.

  53. Anonymous Coward
    Anonymous Coward

    It's all a matter of Trust

    "Our law enforcement cannot be trusted, and neither can our government."

    This seems to be the root cause of most issues in America, even for governments elected via a democratic process (gun control being a prime example). Maybe the US needs to address this issue of trust in government before it can agree on anything else!

  54. IT Hack

    Data

    I would not be surprised if in fact there is absolutely bugger all on that phone.

    Yet another thing caught up in the dragnet that we call modern law enforcement.

  55. Gigabob

    There are Backdoors - and Backdoors

    I accept that "Apple is not being asked to create a backdoor" based on the very narrow definition of defeating the encryption scheme with a "One Ring" Master Crypto. They are being asked to build a mechanism to interrupt the internal boot sequence with an external loader that will divert and reset internal processes. As the FBI and AAPL are both aware, such a defeat would get into the wild and render all pre-iPhone 6 systems susceptible to a similar defeat.
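    A toy sketch of why that external loader matters: the passcode only feeds a key-derivation function, so once the boot-time retry delay and auto-wipe are diverted, a four-digit space falls to exhaustive search almost instantly. Every name, salt, and iteration count below is hypothetical, not Apple's actual scheme.

```python
import hashlib

# All names and numbers here are illustrative, NOT Apple's real design.
SALT = b"per-device-salt"   # stands in for the device's hardware-fused UID
ITERATIONS = 1_000          # real devices tune this far higher (~80 ms/guess)

def derive_key(passcode: str) -> bytes:
    """Stretch a short passcode into a key via PBKDF2-HMAC-SHA256."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), SALT, ITERATIONS)

target = derive_key("0042")   # the key our hypothetical phone holds

# With the retry delay and auto-wipe bypassed, the whole 10,000-code
# space can be exhausted with nothing to slow the attacker down.
found = None
for i in range(10_000):
    guess = f"{i:04d}"
    if derive_key(guess) == target:
        found = guess
        break

print("passcode recovered:", found)
```

    The hardware-enforced guess delay and wipe-after-ten-failures are the only things standing between a short passcode and this loop, which is precisely the protection the order asks Apple to switch off.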

    The good news for AAPL is that this should prompt anyone with an iPhone 5-series or earlier to migrate much more quickly to the 6.

    "Don't buy American" is slightly flawed. I am sure you can get a secure Chinese phone instead. Not!

  56. Anonymous Coward
    Anonymous Coward

    but who would build it?

    If the Apple engineers who would have to write and build the insecure software are the same ones who have just spent the last few years designing software to be unbreakable, then how many are going to cooperate with the FBI? How many would check their employability with Google and Facebook first? How do you make an unwilling engineer put bugs in the software he has lovingly crafted?
