Apple CORED: Boffins reveal password-killer 0-days for iOS and OS X

Six university researchers have revealed deadly zero-day flaws in Apple's iOS and OS X, claiming it is possible to crack Apple's password-storing keychain, break app sandboxes, and bypass its App Store security checks. Attackers can exploit these bugs to steal passwords from installed apps, including the native email client, …

  1. msknight
    Trollface

    Come on...

    "Apple was not immediately available for comment." ... someone pointing this out has to be the top comment on any El Reg Apple report :-D

    1. diodesign (Written by Reg staff) Silver badge

      Re: Come on...

      Apple PR thinks that if they ignore us, we'll go away. They are wrong.

      C.

  2. bazza Silver badge

    Whooooops!

    That is all.

    1. nematoad

      Re: Whooooops!

      Apple stuff is supposed to be idiot-proof and I suppose it is when used by idiots.

      Unfortunately when used by anyone with even half a brain it appears to be wide open to abuse. And not to even respond to the reported threat is a real abdication of any duty of care to its users.

      Not a nice prospect for those naive users who trust Apple implicitly.

      1. Lee D Silver badge

        Re: Whooooops!

        Apple currently still has, on its app store, an app expressly stating that it is intended to be used to "bypass your school filter", etc. It's as simple as installing it, and you get full, free, VPN access to the outside world that's almost undetectable.

        Not a huge issue, but there are no real ways to "block" a particular app install even with MDM APIs. You can turn app installs on or off and monitor them, but you can't blacklist an app. If you want to use, say, Cisco Meraki to push apps to your iPads in a school (a very popular choice) you need to have the "install apps" option on, or else you have to recall every iPad and update it by hand every time.

        The only real option you have is parental filtering, where you can block apps with certain age ratings.

        The above app is STILL, after several reports, marked as being 4+. Apple have steadfastly refused to do anything about it, as they categorically state that it's nothing to do with them and it's up to the app-makers to decide the age-rating (not much point in having an age-rating, then, really?). This app allows bypass of any and all filters and access to the unfiltered Internet, for free, just by clicking "Get App".

        However, Chrome was briefly pulled from the store and recategorised as 18+ because it "allows unrestricted access to the Internet".

        Apple don't care about what they are doing, so long as they are making money. They are right and everyone else is wrong, and that's the end of it. And no amount of head-banging against their complaint department, tech support, etc. will do anything to change that at the moment.

        1. roninway

          Re: Whooooops!

          What's the name of the VPN app buddy?

          1. Anonymous Coward
            Anonymous Coward

            Re: Whooooops!

            The name of the VPN app: which one? There are plenty on the App Store; the one I use (VyprVPN) also has the 4+ rating. Of course, the apps are free; the subscription to a VPN service usually is not. But it's only that: a VPN profile. Browsing, mail, other apps: they use the VPN connection for access, nothing more. So any age limits on apps using that connection stay in place.

        2. Daniel B.
          Boffin

          Re: Whooooops!

          Apple currently still has, on its app store, an app expressly stating that it is intended to be used to "bypass your school filter", etc. It's as simple as installing it, and you get full, free, VPN access to the outside world that's almost undetectable.

          If your school system can't stop VPNs, you're doing it wrong. Pretty much any corporate network I've had to plug into has blocked pretty much all VPN connection methods. Some proxies are even smart enough to detect "SSL" connections that have been transferring far more data than what a regular HTTPS request would require and cut off those connections.

          1. bigtimehustler

            Re: Whooooops!

            Haha, good luck with that SSL method when the whole of the web goes SSL which is slowly happening.

            1. Preston Munchensonton
              Boffin

              Re: Whooooops!

              Haha, good luck with that SSL method when the whole of the web goes SSL which is slowly happening.

              Actually, that won't matter, since the method in question allows the proxy to act as a man-in-the-middle to decrypt the connections, inspect the contents, and reencrypt the HTTPS connections. They get away with this through the use of an internal CA pushed by default to all internal systems, such that the proxies are always trusted, even when they impersonate external HTTPS sites.

              Evil. Pure evil.

          2. Anonymous Coward
            Anonymous Coward

            Re: Whooooops!

            "Some proxies are even smart enough to detect "SSL" connections that have been transferring far more data than what a regular HTTPS request would require and cut off those connections."

            So what happens when a false positive turns up, such as someone trying to download a perfectly legal Linux Live ISO? Plus I would think it would have a hard time handling smurfed sessions, or ones limited to small transfers that are plausible under normal web use.

  3. Mystic Megabyte
    Trollface

    Patent

    Those researchers should patent XARA as "a method for sharing stuff" then sue Apple for using it.

    1. Preston Munchensonton
      Coat

      Re: Patent

      I'm guessing that Apple's defense will be that the researchers were holding it wrong.

  4. Anonymous Coward
    Anonymous Coward

    Six months??? Apple was lucky it wasn't Google to find them...

    ... and even if it was Google, I'm sure it would have been very cautious about imposing its usual silly deadline, and about releasing information on vulnerabilities in systems it uses itself...

    1. Charlie Clark Silver badge
      Thumb Down

      Re: Six months??? Apple was lucky it wasn't Google to find them...

      So, the note about the Chromium team disabling the affected part escaped your notice?

      You need bright light to find bugs. Delaying publication does not really improve security. Who's to say that other people (criminals, spies) haven't found the same vulnerabilities?

      1. sabroni Silver badge

        Re: Delaying publication does not really improve security.

        But neither does exposing updates too soon. You don't have to be smart enough to find a vulnerability if you can see what's changed in the latest patch and work back from there. And while some bugs are just simple errors that can be easily fixed others require core components to be redesigned and rewritten. Applying the same time scale to all vulnerabilities shows a lack of understanding of software development.

        So it's a balancing act; allowing the vendor time to fix things in a timely manner is fine. Waiting six months seems a bit too generous to me though....

        1. fajensen

          Re: Delaying publication does not really improve security.

          Waiting six months seems a bit too generous to me though....

          Government bureaucracy - Apple have to wait with the fix until the NSA comes up with a workaround.

          1. Anonymous Coward
            Anonymous Coward

            Re: Delaying publication does not really improve security.

            Enough. I'm absolutely sick of this bullshit. The way you fucking morons blather on is ridiculous. If you are a Google or Microsoft advocate, just shut the fuck up. Those two are demonstrably more complicit, despite Larry's protestations, than Apple. Open sourcer? The code is fucking open!!! Do you really think that penetration of your systems is beyond the bods at the NSA's capability? Hubris is going to get the better of you and I for one cannot wait until it does, you bunch of sanctimonious pricks.

            And breathe...

      2. Anonymous Coward
        Anonymous Coward

        Re: Six months??? Apple was lucky it wasn't Google to find them...

        LOL! It's astounding how MS is evil, Apple and Google always right... MS has to fix everything in 90 days, Apple can ask for 180 (and is this fixed?) and nobody complains... if the vuln was already known in October 2014 and Chromium fixed it only recently, it took more than 90 days too... it's always easier to apply hard schedules to others than ourselves, right?

        Delaying publication until a fix is ready - and the vendor is working on it - is a good thing. Sure, somebody else may have found the same vulns, but maybe not. If you publish them, you ensure every criminal knows about them and can use them easily. The day a vuln hits you hard in the face, you'll change your mind...

        1. Charlie Clark Silver badge
          Thumb Down

          Re: Six months??? Apple was lucky it wasn't Google to find them...

          LOL! It's astounding how MS is evil, Apple and Google always right.

          What a load of crap! Time to burn your strawman!

          Apple is known to have a terrible record on security updates. That's why many of those who use Macs don't rely on Apple for POSIX libraries. Interestingly, however, it looks like they have learned from the OpenSSL debacle and are moving to LibreSSL for the next version.

          Google might well want everybody's data but does have a good track record when it comes to bug-fixing. This may come from having a pretty good open source culture within the company: they have long been good players in many projects. The proof will, of course, come when someone discovers a major flaw in something like Android that they want holding back.

  5. Pascal Monett Silver badge

    What are all these papers good for ?

    Not knocking the work, and certainly not the results, but when these guys say that this research will be invaluable for future reference, is that really the case ?

    We all know about buffer overflows, yet that door is still open in almost every new malware report. Sometimes they even concern products made by big companies who definitely know better.

    This new report is bringing to light some new obscure chain of consequences that constitute a vulnerability. Great news, but who exactly is going to pore over this to understand what is going on and how to avoid it ? Security researchers, not application coders.

    When I search for "good programming practice", what I find is stuff that generally concerns code clarity and maintainability, rarely security.

    In the best case, there will be a mention of using fgets instead of gets in C, because buffer overflow. But the rest is all about indenting, variable name formatting, function wrapping and commenting. Nothing to do with security.
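
    To make that last point concrete, here's a minimal C sketch of the gets/fgets difference (the buffer size and greeting are just illustrative): fgets() is told how big the destination buffer is, gets() never was.

        #include <stdio.h>
        #include <string.h>

        int main(void)
        {
            char name[64];

            /* gets(name) would read an unbounded line into a 64-byte buffer,
               which is exactly the overflow risk; it was removed in C11. */

            /* fgets() is given the buffer size and stops reading accordingly. */
            if (fgets(name, sizeof name, stdin) != NULL) {
                name[strcspn(name, "\n")] = '\0';   /* strip the trailing newline */
                printf("Hello, %s\n", name);
            }
            return 0;
        }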

    We need an easy-to-read overview of good security practices that does not just say "check your inputs" but details what to check and how to make sure. Is that available somewhere ?

    1. Charlie Clark Silver badge

      Re: What are all these papers good for ?

      True, but I'm not even sure this is that much about programming. It sounds a lot more like design, especially Apple's much-flaunted app sandboxing, which seems to have been undermined.

      1. Michael Wojcik Silver badge

        Re: What are all these papers good for ?

        True, but I'm not even sure this is that much about programming. It sounds a lot more like design, especially Apple's much-flaunted app sandboxing, which seems to have been undermined.

        Have you read the paper? It does discuss specific issues for app developers, even though the general problem probably can't be solved entirely at the app level.

        In any case, saying this sort of research isn't useful for programmers is like saying research into the performance of building materials isn't useful for house builders. Yes, programmers are able to continue writing crap code. That doesn't mean it's impossible for them to learn to do better.

    2. This post has been deleted by its author

    3. Adam 1

      Re: What are all these papers good for ?

      There is a relationship between code clarity and security. Case in point: Apple's "goto fail" bug, the one which borked SSL on OS X and iOS. Had the code been formatted correctly, it would have been very hard to miss that accidental "goto fail" line.
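
      For anyone who hasn't seen it, here is a paraphrased, self-contained C illustration of that bug pattern (not Apple's actual SecureTransport source; the function and values are made up): the mis-indented duplicate goto always runs, so the real check below it never does.

          #include <stdio.h>

          /* Paraphrased illustration of the "goto fail" pattern, not the real code:
             the duplicated, mis-indented goto executes unconditionally, so the
             signature check below it is never reached and err stays 0. */
          static int verify(int signature_ok)
          {
              int err = 0;

              if ((err = 0 /* imagine a hash-update step here */) != 0)
                  goto fail;
                  goto fail;          /* the extra line: jumps ahead with err == 0 */

              if (!signature_ok) {    /* the check that never runs */
                  err = -1;
                  goto fail;
              }

          fail:
              return err;             /* still 0, so a forged signature "passes" */
          }

          int main(void)
          {
              printf("forged signature accepted? %s\n",
                     verify(0) == 0 ? "yes" : "no");
              return 0;
          }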

    4. Brewster's Angle Grinder Silver badge
      Boffin

      Re: What are all these papers good for ?

      The paper does criticise Apple for not making developers aware of the vulnerabilities or providing ways to spot them.

      But, skimming the paper, I saw two classes of problems:

      1. IPC is public. You'd be castigated for allowing access to private website data without forcing the user to login. The same applies to an app's internal services: if the service allows an app to change something or read sensitive data, then verify the caller is who they say they are. This includes communication via any url-scheme; so, for example, if your app can be reached via anglegrinder:param1&param2&etc then any tom, dick or malicious app could do so. Also websockets, etc...

      2. Impersonation. It's possible for one app to impersonate another. (They can register your keychain id and then steal your data. Or register your url scheme and intercept data before it gets to you.) The fixes for this are dependent on Apple, and will probably break apps. But avoid keychain until Apple have sorted it (a defensive check is sketched at the end of this post).

      There wasn't a buffer overflow in sight. These weren't programming errors, they were design errors. And for those of us who have been around the block, mitigation is plain common sense (AKA EXTREME PARANOIA).
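
      On the keychain point, one cheap defensive check (only a sketch, not the paper's full mitigation; the service name below is made up) is to refuse to write a secret into a generic-password item that already existed before your app's first run, since it may have been planted by another app:

          #include <Security/Security.h>
          #include <CoreFoundation/CoreFoundation.h>
          #include <stdio.h>

          /* Returns 1 if a generic-password item for the given service already
             exists in the keychain. A paranoid app treats a pre-existing item
             as suspect instead of happily storing its secret into it. */
          static int keychain_item_exists(CFStringRef service)
          {
              const void *keys[] = { kSecClass, kSecAttrService,
                                     kSecMatchLimit, kSecReturnAttributes };
              const void *vals[] = { kSecClassGenericPassword, service,
                                     kSecMatchLimitOne, kCFBooleanTrue };
              CFDictionaryRef query = CFDictionaryCreate(NULL, keys, vals, 4,
                                          &kCFTypeDictionaryKeyCallBacks,
                                          &kCFTypeDictionaryValueCallBacks);
              CFTypeRef found = NULL;
              OSStatus st = SecItemCopyMatching(query, &found);
              if (found) CFRelease(found);
              CFRelease(query);
              return st == errSecSuccess;
          }

          int main(void)
          {
              /* "com.example.anglegrinder" is a hypothetical service name. */
              if (keychain_item_exists(CFSTR("com.example.anglegrinder")))
                  fprintf(stderr, "item already present - not storing the secret\n");
              else
                  puts("no existing item; create a fresh one with SecItemAdd()");
              return 0;
          }

      (Build with -framework Security -framework CoreFoundation on OS X; it doesn't stop every XARA trick, it just raises the bar.)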

    5. Michael Wojcik Silver badge

      Re: What are all these papers good for ?

      We need an easy-to-read overview of good security practices that does not just say "check your inputs" but details what to check and how to make sure. Is that available somewhere ?

      All over the place.

      If you're programming in traditional procedural languages, try Howard et al., 24 Deadly Sins of Software Security. Originally 19 Deadly Sins.... I think the first edition came out in 2005, so it's been available for the past decade.

      Organizations like SANS and OWASP have been publishing "top ten" vulnerability lists for years. The main SANS list goes back at least to 2000. The OWASP list is specifically for web applications, though some of the concepts are applicable elsewhere. OWASP has a good wiki and other materials that describe specific remediation steps. There are many, many online articles that discuss these lists and remediation steps for the vulnerabilities they describe.

      There are the Security Focus mailing lists. Bugtraq is the most famous, but they have a "Security Basics" list, and in the early 2000s there was a "Security Programming" (SecProg) list; the archives are still available at securityfocus.com/archives, along with those for VulnDev and others. Back in the day there was plenty of activity on Usenet groups like comp.unix.security.

      And of course there are any number of more-general treatments that will actually teach developers how to think about security and develop with it in mind, rather than simply following a list of rules. There's the O'Reilly Computer Security Basics book (Russell & Gangemi), for example, or Anderson's Security Engineering - which is available free online.

  6. cyke1

    They told you to wait 6 months? Is that a JOKE? Seriously, I would tell them they've got 1 month and then it's made public. The reason for that is Apple is terrible at fixing flaws in their crap. Make them get off their butts and fix it.

    1. sabroni Silver badge

      And the difference between that and extortion is that you're a good guy?

      1. Anonymous Coward
        Anonymous Coward

        Extortion would be saying that they need to pay $1million by midnight or you'll release the code tomorrow.

        1. cyke1

          Given the problem with this flaw, Apple shouldn't need 6 months; hell, they've known about it for 9 months now and still haven't fixed it. The reason you say 1 month is to make Apple get off their butt and FIX IT. The problem is Apple has a nasty habit of NOT fixing stuff in a timely manner; EVEN with ALL THE MONEY they make, it takes them months on end to fix an issue.

          1. Anonymous Coward
            Anonymous Coward

            The problem is Apple has a nasty habit of NOT fixing stuff in a timely manner; EVEN with ALL THE MONEY they make, it takes them months on end to fix an issue.

            I don't see evidence of them NOT reacting to issues, but it is true that they sometimes take their time. The OpenSSL and bash bugs were nailed pretty quickly though, so I wonder why they took this long. Maybe the issue is too complex to patch quickly? It would be interesting to know.

            1. cyke1

              I never said they haven't reacted to it, I said they have an issue getting things fixed in a timely manner. There was a flaw called Flashback, I think it was, many years ago. It was a Java exploit, a 0-day bug whose Windows fix was out within a day; Apple had the updated code to fix it as well, but took 2 months before releasing the fix. Apple has a habit of taking a LONG time to fix nasty security flaws. It's so bad that it would be easy to say Windows is 10x more secure than Apple's OSes just on the fact that MS fixes things in a reasonable amount of time, whereas with Apple you can't expect it to be fixed for at least 2 months, if not more.

        2. Anonymous Coward
          Anonymous Coward

          Extortion?

          In British law, blackmail or, if money requested, demanding money with menaces.

          1. Anonymous Coward
            Anonymous Coward

            Re: Extortion?

            a criminal offence of obtaining money, property, or services from a person, entity, or institution, through coercion.

      2. This post has been deleted by its author

      3. Spasticus Autisticus
        FAIL

        A car with a serious flaw in its operation would be recalled and be fixed by the manufacturer or supplier in a time scale commensurate with the scale of the danger. If the scale of the fault is as great as losing passwords then a fast fix has to happen - or it might be better to turn the product off and not use it again until it is fixed.

        1. Lallabalalla

          A car with a serious flaw in its operation...

          would be recalled and fixed... sometimes.

          Sometimes, in the past, the mfr has decided it's cheaper to settle the surviving family members' lawsuits than fix the flaw. Or just ignore the issue, like with diesels' fuel filters, or VW Touran ABS modules and flywheels falling apart.

        2. Anonymous Coward
          Anonymous Coward

          not true

          Just look up GM and ignition switches. Oh, and because people died while they were still in bankruptcy protection, they aren't financially liable for the deaths.

  7. Charlie Clark Silver badge

    Journalism 101

    The article is generally better than Mr Pauli's dashes but still contains some misleading and poorly expressed parts. For example,

    They found "security-critical vulnerabilities" including cross-app resource-sharing mechanisms and communications channels such as keychain, WebSocket and Scheme.

    In this context "security-critical vulnerabilities" should not be quoted because it is in the context of the report. If the author wants to emphasise that this is a claim made by the researchers that has yet to be confirmed then more explicit context can be added: "the researchers claim that there are security-critical vulnerabilities…"

    Resource-sharing is essentially what an operating system does for applications and is always "cross-app". However, this sounds more like it is related to resources being shared between apps.

    "Scheme" is a programming language, LISP like as far as I know but I'm probably wrong. Further down in the report this is clarified as referring to the URL-scheme used and not the programming language. BID is thrown in later without explanation of the acronym.

  8. Dan 55 Silver badge

    Don't think Apple's got it where proper OS design is any more

    XProtect and the app-signing (Gatekeeper) checks both key off file metadata (the quarantine extended attribute); strip it with a single xattr command and you've lost the protection.
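
    For the curious, the metadata in question is the com.apple.quarantine extended attribute; here's a minimal C sketch (using OS X's six-argument getxattr) that simply reports whether a file still carries it:

        #include <stdio.h>
        #include <sys/xattr.h>      /* OS X extended-attribute API */

        /* Report whether a downloaded file still carries the quarantine
           attribute that the Gatekeeper/XProtect checks key off. */
        int main(int argc, char **argv)
        {
            if (argc < 2) {
                fprintf(stderr, "usage: %s <file>\n", argv[0]);
                return 1;
            }
            /* On OS X, getxattr takes extra position and options arguments. */
            ssize_t len = getxattr(argv[1], "com.apple.quarantine", NULL, 0, 0, 0);
            printf("%s: quarantine attribute %s\n", argv[1],
                   len >= 0 ? "present" : "absent");
            return 0;
        }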

    Apple didn't backport the rootpipe fix to Mavericks or earlier, and the fix for Yosemite didn't really address the issue.

    https://reverse.put.as/2015/04/13/how-to-fix-rootpipe-in-mavericks-and-call-apples-bullshit-bluff-about-rootpipe-fixes/

    Yosemite networking is a disaster thanks to discoveryd.

    The security fix policy is something we only know from what's actually been updated, rather than from any official statement.

    Now Keychain was owned six months ago and they've been unable to fix it.

    It has rather gone downhill since Snow Leopard.

    1. Sgt_Oddball

      Re: Don't think Apple's got it where proper OS design is any more

      So does this mean we're heading towards a moment where "Le gasp" windows might, possibly, at the very edge of reason be more secure than osx?

      Who'd've thunk it.

    2. Anonymous Coward
      Anonymous Coward

      Re: Don't think Apple's got it where proper OS design is any more

      No, it has gone downhill since Cook took over.

    3. Charlie Clark Silver badge

      Re: Don't think Apple's got it where proper OS design is any more

      It has rather gone downhill since Snow Leopard.

      Snow Leopard has enough problems of its own due to the switch to 64-bit…

  9. Anonymous Coward
    Anonymous Coward

    Here we go again

    In response to all these security bulletins I created a special VLAN at work for our insecure devices.

    I called it "LAN" and then wept.

  10. Anonymous Coward
    Anonymous Coward

    But shirley...

    ... if you have 2 processes running under the same user id on a system, then 1 process can attach to the other and scan its memory anyway. Which admittedly requires a large amount of knowledge of unix systems programming but the potential is there. How is this different other than some badly written libraries make it slightly easier?

    1. Anonymous Blowhard

      Re: But shirley...

      "if you have 2 processes running under the same user id on a system, then 1 process can attach to the other and scan its memory anyway"

      No, modern (post 1990s multi-user system) operating systems should manage the memory space for applications to prevent this.

      1. Anonymous Coward
        Anonymous Coward

        Re: But shirley...

        "No, modern (post 1990s multi-user system) operating systems should manage the memory space for applications to prevent this."

        I didn't mean reading the memory directly; it needs OS support to bypass the standard memory protection. But it can be done, otherwise debuggers, trace and profiling programs wouldn't work.

        1. Anonymous Coward
          Anonymous Coward

          Re: But shirley...

          Correct, but usually there are privileges needed for such operations, privileges a "normal" user should not have. If the user for any reason has or can gain those privileges, the OS will comply and make the memory not only readable, but maybe writeable as well...

          1. Anonymous Coward
            Anonymous Coward

            Re: But shirley...

            "Correct, but usually there are privileges needed for such operations, privileges a "normal" user should not have"

            So all debuggers have to run with root privs?

            Back to school for you I think.

            1. Gideon 1

              Re: But shirley...

              "So all debuggers have to run with root privs?"

              No, the debugger can run the app as a child process, retaining the ability to peek into its innards.

              1. Anonymous Coward
                Anonymous Coward

                Re: But shirley...

                "No, the debugger can run the app as a child process, retaining the ability to peek into its innards."

                And if the debugger has to go into a live (already-running) application?

              2. Anonymous Coward
                Anonymous Coward

                Re: But shirley...

                "No, the debugger can run the app as a child process, retaining the ability to peek into its innards."

                They can also attach to running processes. At least on Unix, no idea about Windows.

                1. Michael Wojcik Silver badge

                  Re: But shirley...

                  They can also attach to running processes. At least on Unix, no idea about Windows.

                  Windows as well. The design of the Windows protection model for userland processes is different from the UNIX one, but the result is broadly a similar protection model.

                  Windows has a more thorough use of object ACLs so the access determination is more complex and nuanced than just "source uid == target uid == target euid", but to a first approximation it's the same sort of thing. Particularly when you compare it with the whole universe of commercial OSes, some of which are significantly different (e.g. System i) or very different (e.g. Orange Book A1 systems like SNS).

      2. Michael Wojcik Silver badge

        Re: But shirley...

        No, modern (post 1990s multi-user system) operating systems should manage the memory space for applications to prevent this.

        This is simply wrong. Take Linux, for example. From the ptrace(2) man page:

        EPERM The specified process cannot be traced. This could be because the parent has insufficient privileges (the required capability is CAP_SYS_PTRACE); non-root processes cannot trace processes that they cannot send signals to or those running set-user-ID/set-group-ID programs, for obvious reasons. Alternatively, the process may already be being traced, or be init (PID 1).

        Consider in particular the bit about "non-root processes". Processes with normal privileges (non-superuser, without CAP_SYS_PTRACE) can trace processes running with the same uid and euid. That includes reading and writing the process's private memory.
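
        A minimal Linux sketch of exactly that (the target PID and address come from the command line; note that some distributions restrict same-uid attach by default via Yama's ptrace_scope):

            #include <stdio.h>
            #include <stdlib.h>
            #include <errno.h>
            #include <sys/ptrace.h>
            #include <sys/wait.h>
            #include <sys/types.h>

            /* Attach to a process owned by the same user and peek one word of
               its memory. No root needed, provided the target shares our
               uid/euid, isn't set-uid, and no ptrace_scope policy blocks it. */
            int main(int argc, char **argv)
            {
                if (argc < 3) {
                    fprintf(stderr, "usage: %s <pid> <hex-address>\n", argv[0]);
                    return 1;
                }
                pid_t pid = (pid_t)atoi(argv[1]);
                void *addr = (void *)strtoul(argv[2], NULL, 16);

                if (ptrace(PTRACE_ATTACH, pid, NULL, NULL) == -1) {
                    perror("PTRACE_ATTACH");
                    return 1;
                }
                waitpid(pid, NULL, 0);                  /* wait for the target to stop */

                errno = 0;
                long word = ptrace(PTRACE_PEEKDATA, pid, addr, NULL);
                if (errno != 0)
                    perror("PTRACE_PEEKDATA");
                else
                    printf("word at %p: 0x%lx\n", addr, word);

                ptrace(PTRACE_DETACH, pid, NULL, NULL); /* let the target run again */
                return 0;
            }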

        On Windows, similarly, a normal-privilege process can open a handle to another process running with the same security token, and through that handle manipulate process memory and even do things like creating threads in the target.
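
        And a comparable Windows sketch in C (the target PID and address are placeholders, not real values):

            #include <windows.h>
            #include <stdio.h>

            /* Open a process running under the same user and read a few bytes of
               its memory; no special privilege is needed for same-user targets. */
            int main(void)
            {
                DWORD pid = 1234;                    /* hypothetical target PID */
                LPCVOID addr = (LPCVOID)0x10000000;  /* hypothetical address    */
                unsigned char buf[16];
                SIZE_T got = 0;

                HANDLE h = OpenProcess(PROCESS_VM_READ | PROCESS_QUERY_INFORMATION,
                                       FALSE, pid);
                if (h == NULL) {
                    printf("OpenProcess failed: %lu\n", GetLastError());
                    return 1;
                }
                if (ReadProcessMemory(h, addr, buf, sizeof buf, &got))
                    printf("read %lu bytes\n", (unsigned long)got);
                else
                    printf("ReadProcessMemory failed: %lu\n", GetLastError());

                CloseHandle(h);
                return 0;
            }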

        Security models for multiuser operating systems typically impose access-control requirements at user and system granularity: that is, access controls must protect resources owned by a user from other users, and system resources from invalid access by user-mode code. That's essentially how the Orange Book (which came out in 1983, by the way - your "post 1990s" date is way off) defines the C2 level, for example.

    2. Michael Wojcik Silver badge

      Re: But shirley...

      ... if you have 2 processes running under the same user id on a system, then 1 process can attach to the other and scan its memory anyway

      That depends on the operating system. But I'll assume we're talking about UNIX-family OSes here.

      That's why the resource isolation model in iOS doesn't simply run apps as conventional UNIX processes under the same ID. There's more information in the paper, or elsewhere.

      Under Android, according to the paper, each app runs under a different UID. (I haven't bothered trying to confirm this from other sources.)

  11. TWB

    Worrying

    I have never used Keychain - I've never felt it was safe 'leaving' my passwords 'in the computer' even if they are encrypted and supposedly locked down somehow. Having said that, like most people my email password is stored within my mail client and I don't want to have to type it in every time I want to check for email.

    OSes, eh? Is it going to be Linux next time for me, or do I give up the computer time-wasting altogether?

    1. Anonymous Coward
      Anonymous Coward

      Re: Worrying

      "I've never felt it was safe 'leaving' my passwords 'in the computer' "

      Err, where do you think passwords are normally stored on a home computer? Or do you think everything is stored on a remote server... sorry, I mean in Dah Cloud? Even the browser stores your online passwords in an encrypted cache.

      1. Irongut

        Re: Worrying @boltar

        On your computers maybe. But on mine, and presumably the OP's, my browsers do not store any passwords, encrypted or not.

        Some of us care about security rather than using potentially insecure features that save you a few seconds.

        1. Anonymous Coward
          Anonymous Coward

          Re: Worrying @boltar

          "On your computers maybe. But on mine, and presumably the op's, my browsers do not store any passwords encrypted or not."

          I'm assuming you have a password for the main login on your computer, which whether you like it or not will be stored on the local machine.

          And if for example you're using a web proxy server you enter the proxy password every time your browser needs to fetch a page do you? And you manually type in every password for every website you use?

          It makes sense not to cache banking passwords, but crap like social media, who cares?

          1. sd123

            Re: Worrying @boltar

            Personally I'd hope that my computer just stores a salted hash of my password, not the password itself :-)

            1. Anonymous Coward
              Anonymous Coward

              Re: Worrying @boltar

              "Personally I'd hope that my computer just stores a salted hash of my password note the password itself :-)"

              The OS, yes, but not browsers; they store the actual encrypted password, otherwise they wouldn't be able to auto-complete password fields.

              1. Michael Wojcik Silver badge

                Re: Worrying @boltar

                "Personally I'd hope that my computer just stores a salted hash of my password note the password itself :-)"

                The OS, yes, but not browsers; they store the actual encrypted password, otherwise they wouldn't be able to auto-complete password fields.

                That was rather the OP's point. The OS can store a non-reversible password verifier (a hash, a ZKP verifier as with SRP or PAK-RY, etc). The browser needs to store a reversible encrypted password. So not using the browser's autocomplete feature removes a significant branch of the attack tree, and your post about the OS "storing the main password" is irrelevant.
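
                A toy C sketch of the non-reversible-verifier idea using crypt(3) (the password and salt are examples only; glibc may want <crypt.h> and -lcrypt):

                    #include <stdio.h>
                    #include <string.h>
                    #include <unistd.h>   /* declares crypt() on many Unixes */

                    int main(void)
                    {
                        /* The salt (with its "$6$" SHA-512 prefix on glibc) is kept
                           alongside the hash; this example salt is arbitrary. */
                        const char *h = crypt("correct horse", "$6$9UezY1a2$");
                        if (h == NULL) { perror("crypt"); return 1; }

                        /* Copy it: crypt() reuses a static buffer between calls. */
                        char stored[128];
                        strncpy(stored, h, sizeof stored - 1);
                        stored[sizeof stored - 1] = '\0';

                        /* Verification re-hashes the attempt with the same salt (the
                           stored string doubles as the salt argument) and compares. */
                        const char *attempt = crypt("correct horse", stored);
                        printf("right password: %s\n",
                               attempt && strcmp(attempt, stored) == 0 ? "accepted" : "denied");

                        attempt = crypt("battery staple", stored);
                        printf("wrong password: %s\n",
                               attempt && strcmp(attempt, stored) == 0 ? "accepted" : "denied");

                        /* Nothing stored here can be turned back into the password,
                           which is why a browser that wants to auto-fill a form field
                           cannot use this scheme and must keep something reversible. */
                        return 0;
                    }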

                On the other hand, not using the browser's autocomplete feature or other "password safe" technologies means the user types the password more frequently, increasing the attack surface for e.g. keyloggers and some forms of phishing attacks. It's a trade-off. Personally, I don't use a password safe and disable many other sorts of credential caching, as that's the less risky option under my threat model. It means I type my (38-character) Windows domain password half a dozen times each day, but I'm a fast typist.

    2. Anonymous Coward
      Anonymous Coward

      Re: Worrying

      I have never used Keychain

      If you use OSX you have, but probably via applications. Just open Keychain and search, for instance, for your email address.

      1. TWB

        Re: Worrying @Boltar and @AC

        Well I feel a bit of a dunce now - but better informed (so thanks in a way), I had a look in Keychain (never opened it before) and saw that there are some passwords there. I did not realise that for some systems (typically Apple stuff) it stored them there whether you liked it or not. When the pop-up had appeared in the past 'Do you want Keychain to store this password' - I had always said no.

        I guess my browsers store some passwords - I do occasionally reset them clearing certain logons but I type in passwords for things like online stores and banking when I need them - don't know how risky it is being logged into the register often - I tend to have different passwords for everything.

        Being a miserable git (or whatever term takes your fancy) I've not really got into the social media stuff

        1. Adam 1

          Re: Worrying @Boltar and @AC

          >don't know how risky it is being logged into the register often

          As long as your credentials are sent over https, it should be fine. Oh wait..

  12. Terje

    I feel that there are basically two main options for why this has not been fixed yet.

    option 1.

    We are apple, we don't care.

    option 2.

    The issue is a design flaw more than an implementation flaw, and thus they have no clue how to plug it without breaking everything or doing big rewrites of loads of core components, thus involving tons of management at all levels of the company and burying the "project" in glue.

    1. Anonymous Coward
      Anonymous Coward

      "option 1.

      We are apple, we don't care." its more like this " We are apple, we all know your too stupid to hold the iphone correctly so these serious issues are beyond your understanding- don't worry, its an android problem. "

      1. Phuq Witt
        Facepalm

        Bingo!

        "...We are apple, we all know your too stupid..."

        Back of the net!

    2. Len

      At first sight this looks like an issue that is buried very deep and could require a considerable overhaul of the underlying system. Not a question of checking some bounds or adding an escape character.

      As that could take fairly long to fix they've probably already separately looked at the attack vector. At least that reduces the risk until the developers have fixed the underlying problem. It seems that the malware would have to come in via the App Store, otherwise why would these researchers have gone down that route? If the App Store is the only attack vector and you know what a malicious app needs to do to gain access you can look for it in the app vetting process.

      It's not perfect but it's better than nothing.

      1. Terje

        I agree with what you say, though from what I understood (without doing any additional research), shouldn't it be just as plausible to exploit any vulnerability allowing code execution (even within the sandbox) in any installed app, in order to get at this?

    3. Anonymous Coward
      Anonymous Coward

      Reading the paper, the vulns are in the app sandboxing, IPC and WebSocket design - so well deep inside the OS. Any change may impact a lot of applications that could stop working; it could really take quite some time to understand how to fix it without big breaking changes.

      That's why those who believe you can always fix a vuln in a few weeks are those who never worked on a complex piece of software, with a lot of other software beyond your control depending on it. And a lot of customers who would become really angry if you broke something badly.

      1. Charlie Clark Silver badge

        That's why those who believe you can always fix a vuln in a few weeks are those who never worked on a complex piece of software, with a lot of other software beyond your control depending on it.

        It's primarily a design issue that should have been picked up a long time ago. How do you think the liability should be handled if someone experiences harm as a result? Disclosure isn't really any different to finding defects in laptop batteries, or car accelerator pedals.

  13. jason 7
    Devil

    But but...but...

    ...this isn't mentioned on the BBC website...so it can't be true!

    Most odd for them to not mention something Apple related...

    1. Blane Bramble

      Re: But but...but...

      I heard Stephen Fry is working with Apple to fix this at this very moment.

  14. Phil Endecott

    Keychain on iOS is secure

    Having just read the PDF -

    - The keychain on iOS is not affected.

    - The only thing on iOS that is affected is URL schemes. This has been known forever; anyone can publish an app which claims any URL scheme, so you shouldn't send anything sensitive using them.

    OSX has more holes....

    1. Charlie Clark Silver badge

      Re: Keychain on iOS is secure

      OSX has different holes....

      FTFY…

  15. chivo243 Silver badge

    Waiting game?

    I heard that El Capitan makes these issues moot in OS X, but how many people will be screwed in the meantime?

    1. tempemeaty
      Mushroom

      Re: Waiting game?

      I imagine a lot will feel the screw. Many of us CAN'T upgrade to the next version of OSX every time they release one. It breaks our existing programs too often. Add to that Apple's POLICY of refusing to patch ANY past versions once another is released... we are just screwed. This is the "Apple Screw™" and how it turns...

      1. chivo243 Silver badge
        Coat

        Re: Waiting game?

        I feel your pain... I have a wonderful G5 PPC that still does what I want, but will never see another patch or update. Long live 10.5.8!

        I find drinking the kool-aid in small sips makes it go down easier.... MMMMMmmmmm Apple flavored!

        1. Frank Bough

          Re: Waiting game?

          You could save a lot of leccy with an Intel machine.

      2. chris 17 Silver badge

        Re: Waiting game?

        @ tempemeaty

        my 2008 13" macbook is running the latest Mac OS Yosemite, how far back do you expect them to goto? PowerPC support was dropped sometime ago around the same time as 32 bit intel chips. if your machine can't support Yosemite its time to upgrade hardware.

        1. cyke1

          Re: Waiting game?

          From what I've seen, Apple has no problem doing it after 2 years; that is generally how old an OS is when it's cut off completely from updates. PPC, I think, got 1 new OS version after the switch to Intel; after that it was dead.

        2. P. Lee

          Re: Waiting game?

          The problem is not OS support, it's Apple's insistence on selling low-spec'd systems which can't be upgraded. The OS may run fine, but iTunes? Mail?

      3. Frank Bough

        Re: Waiting game?

        Update = patch

  16. Stevie

    Bah!

    Two things:

    a) Buggery buggering bugger!

    2) If you lot are going to nitpick off-topic minutiae, could you please either not do it from a phone or if you must use a phone, turn off its auto-correct? Some of the posts here have been auto-corrected into complete garage.

    1. Anonymous Coward
      Anonymous Coward

      Re: Bah!

      auto-corrected into complete garage.

      I saw what you did there :)

  17. Nanners

    You need security

    Luckily we can sell it to you. Or we can release these exploits to the Russians. It's your call... But we're not blackmailing... just letting you know.

  18. JLV
    Unhappy

    we've been here before

    Once again, Apple is caught with its pants down despite being based on a BSD OS. As a customer (most certainly not a fanboi), I find this very concerning.

    A vulnerability that allows peeking into a password manager? Can't get much worse than that, can we?

    They are, after all, a $700B+ company. Surely they could lock down their systems entirely, and not pull a "geez, hard to fix it, man", like they have been doing with rootpipe. Or do a mid-90s Microsoft and claim that integration and ease of use trump security. For example, let me disable Keychain, at least for certain apps, until this is fixed.

    Start with the premise that anything that involves authentication or authorization, and does not come from the BSD core, needs to be reviewed with the most extreme paranoia. Well, the BSD core too, but that's been out there longer.

    That would be worth a $1B or 2, surely. Lots of tasty bounties for example.

    Rather than splurging $3B on a maker of flashy headphones with questionable acoustics.

    Awww, what do I know? Just a dumb user.

    1. cyke1

      Re: we've been here before

      Seems to be a recurring trend with Apple: everything is a hard-to-fix problem. Guess that is how it ends up when the only thing in your face is money.

      If you remember the celeb photo leak - yeah, Apple knew of the flaw that was used 6 months ahead of time; nothing was done to fix it till after the leak.

  19. Anonymous Coward
    Anonymous Coward

    1password considerations

    Take a look here.

    https://discussions.agilebits.com/discussion/comment/212590/#Comment_212590

    Note that their attack does not gain full access to your 1Password data, but only to those passwords being sent from the browser to 1Password Mini. In this sense, it is getting the same sort of information that a malicious browser extension might get, whether or not you use 1Password.

    I never felt very happy myself with password manager to browser integration and kept it turned off. So it seems I am OK for now.

    Note also that if you think, like some individual posters (morons & fanbois) at 9to5mac.com, that this whole thingy is overblown, having the vendor of a password manager confirm that they've worked on it and that it is real should be a bit of a reality distortion field remover. If that is possible with deep-down fanbois.

  20. MeganOBrien

    1Password's response

    Hi, I'm Megan and I work for AgileBits, the makers of 1Password. For our security expert's thoughts on this article, please see our blog: https://blog.agilebits.com/2015/06/17/1password-inter-process-communication-discussion/. If you have further questions, we'd love to hear your thoughts in our discussion forums: https://discussions.agilebits.com.

  21. Anonymous Coward
    Anonymous Coward

    Missing a major point?

    Despite the assertions from numerous pro-Apple commentators in these hallowed forums, it would appear that the much-vaunted screening of apps in the App Store isn't all it's claimed to be.

    This isn't the first time that researchers have had malicious apps published and made public, yet so many Apple apologists insist that there is no malware in the App Store, Apple weed it all out, you only get malware if you jailbreak, etc.

    Surely nobody with an ounce of common sense can really believe that, if security researchers can have malware published, there are no apps on the App Store created by the criminal fraternity containing malware that exploits as-yet-undisclosed vulnerabilities.

    1. Anonymous Coward
      Anonymous Coward

      Re: Missing a major point?

      >>yet so many Apple apologists insist that there is no malware in the App Store, Apple weed it all out, you only get malware if you jailbreak, etc.

      You've misunderstood. Nobody thinks Apple's vetting process is anything special.

      The reason people say there's no malware in the App Store, and that you only get malware if you jailbreak, is because of Apple's sandboxing system, which has seemed pretty darned robust until just now.

      Apple fans aren't necessarily as stupid as you seem to assume.

      1. Robert Grant

        Re: Missing a major point?

        > Apple fans aren't necessarily as stupid as you seem to assume.

        Being a fan of a business you don't own isn't a great sign.

        1. Anonymous Coward
          Anonymous Coward

          Re: Missing a major point?

          >>Being a fan of a business you don't own isn't a great sign.

          It's a publicly traded company. I'm sure millions of people are owners, myself included.

      2. Michael Wojcik Silver badge

        Re: Missing a major point?

        The reason people say there's no malware in the App Store, and that you only get malware if you jailbreak, is because of Apple's sandboxing system, which has seemed pretty darned robust until just now.

        I dare say most of the people who say that have no idea the sandbox even exists, and are just repeating what they've heard.

        Certainly most of the people that I've heard claim the iOS App Store is free of malware are not software security experts, or even vaguely familiar with the field.

        1. Anonymous Coward
          Anonymous Coward

          Re: Missing a major point?

          >>I dare say most of the people who say that have no idea the sandbox even exists, and are just repeating what they've heard.

          That's fine, they are still correct by proxy.

  22. Andy3

    AAAW, and here's me thinking Apple stuff was made by genii and was perfect in every way. Ahem.

  23. wsm

    Another version in the offing

    If this truly is a fundamental flaw in the OS, it's going to take another version to fix it. We can expect to see new programmer training and new versions of most OS X-compatible apps with the next variation.

    It's bound to happen. Write once, run only in the old version, write it again to run it in the new and improved secure version.

  24. azaks

    Wasted opportunity...

    To think what they could have achieved by disclosing this information irresponsibly.

    I've done some quick estimates - denial down 22%, self-righteousness down 38%, and smugness down a whopping 76%...

  25. ecofeco Silver badge

    This is where I ask my usual question

    How's that cloud thing working for ya?
