Medical data, staff creds exposed as scores of apps bork the backend

And still we fail to learn: a quintet of researchers has found that the bad practice of writing keys into code persists among some of the world's most popular Android and iOS applications. The researchers say the hard-coded credentials can be easily extracted to gain access to and manipulate millions of sensitive individual and …

  1. Anonymous Coward
    Anonymous Coward

    News flash, programmers suck at security

    This is why every modern OS provides good APIs that (mostly) do it right. App developers that fail to take advantage of that and roll their own do a worse job 99.99% of the time.
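
    On Android, for example, that means letting the hardware-backed Keystore hold the key material rather than burying secrets in code or plain preferences. A minimal Kotlin sketch (the alias name is illustrative):

        import android.security.keystore.KeyGenParameterSpec
        import android.security.keystore.KeyProperties
        import java.security.KeyStore
        import javax.crypto.KeyGenerator
        import javax.crypto.SecretKey

        // Fetch an AES key from the Android Keystore, generating it on first
        // use. The key material never leaves the keystore, so there is nothing
        // to hard-code and nothing to extract from the APK.
        fun getOrCreateAppKey(alias: String = "app_master_key"): SecretKey {
            val ks = KeyStore.getInstance("AndroidKeyStore").apply { load(null) }
            (ks.getKey(alias, null) as? SecretKey)?.let { return it }

            val generator = KeyGenerator.getInstance(
                KeyProperties.KEY_ALGORITHM_AES, "AndroidKeyStore")
            generator.init(
                KeyGenParameterSpec.Builder(
                    alias,
                    KeyProperties.PURPOSE_ENCRYPT or KeyProperties.PURPOSE_DECRYPT)
                    .setBlockModes(KeyProperties.BLOCK_MODE_GCM)
                    .setEncryptionPaddings(KeyProperties.ENCRYPTION_PADDING_NONE)
                    .build())
            return generator.generateKey()
        }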

  2. channel extended

    Let's be AGILE, FUTILE, and PUERILE

    Face it, crap software is still crap.

  3. Paul Frankheimer

    Misleading article

    After reading the Register article and the researchers' PDF, I have to say that the article here doesn't really explain the problem correctly.

    The actual problem is that the apps don't use any keys apart from the key needed to authenticate the app developer to the backend platform. This key has to be saved somewhere in the application and that's OK. The individual *users* should however also have a specific key so that they are also authenticated with respect to the backend. And for the sample shown, there was no security at all at that level.
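
    A rough sketch of that missing layer, with a made-up host and header names, just to show where the per-user credential would sit alongside the shared app key:

        import java.net.HttpURLConnection
        import java.net.URL

        // The shared app ID/key only says "this request comes from the app".
        // The session token, issued by the backend when this particular user
        // signs in, is what actually ties the request to one account.
        fun fetchMyRecords(appId: String, sessionToken: String): String {
            val conn = URL("https://backend.example.com/records")
                .openConnection() as HttpURLConnection
            conn.setRequestProperty("X-Application-Id", appId)                // app-level
            conn.setRequestProperty("Authorization", "Bearer $sessionToken")  // user-level
            return conn.inputStream.bufferedReader().use { it.readText() }
        }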

    1. Robert Helpmann??
      Childcatcher

      Re: Misleading article

      "The individual *users* should however also have a specific key so that they are also authenticated with respect to the backend."

      Well, yes, but as the researchers point out, the users are pretty much at the mercy of the developers in this respect, and the developers are only putting in enough effort to get the app talking to the back end. At no point in these flawed apps would I expect the people using the apps to have an opportunity to set up their own keys. To go into a little more detail, the article states:

      By default, most BaaS solutions require an application only to authenticate using an ID that uniquely identifies the app, and a so-called "secret" key, used to indicate that the app uses the ID legitimately. These credentials, however, neither authenticate a device nor a user. They merely authenticate the app as such and are therefore shared between all installations of this app.

      So it looks as though it is not only the devs basing their apps on the BaaS solutions that fail to practice good security, but those that offer the BaaS solutions as well. And so the dominoes fall.
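
      For what it's worth, the pattern the paper describes decompiles to something like this (all names and values invented): two strings baked into every copy of the app, authenticating the app rather than any device or user.

          // Roughly what a decompiler recovers from such an app (values invented):
          // the same two credentials ship in every install, so holding them is
          // enough to talk to the backend as "the app".
          object BaasCredentials {
              const val APPLICATION_ID = "a1b2c3d4e5f6"
              const val CLIENT_KEY = "hardcoded-secret-key"
          }

          fun main() {
              println("Anyone with ${BaasCredentials.APPLICATION_ID} and " +
                      "${BaasCredentials.CLIENT_KEY} is indistinguishable from the app.")
          }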

  4. Paul Frankheimer

    Re: Misleading article

    "At no point in these flawed apps would I expect the people using the apps to have an opportunity to set up their own keys."

    Of course, that should not be something that a user has to do himself; it should be done by the app on the *user's phone*. However, it should be a key unique to that user's phone (or maybe account), generated when the user installs the app and signs up for the service or links the app to the service.

    You're quoting the right paragraph. However, the conclusion is not entirely correct. Basically the BaaS services are providing a service to the app developer. The app developer has an ID and a key which he can use to access his service on the BaaS. The app developer is then responsible for developing an app which doesn't allow app user A to access app user B's account and data. The BaaS provider (e.g. Amazon AWS) doesn't give the app developer a user management solution. If they did and that was broken, the problem would be much bigger than the quoted 0.5% of apps.
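
    As a sketch of what that per-install credential could look like (the flow and names are mine, not anything from the paper): generate a key pair on first run, register the public half with the backend when the account is created, and keep the private half on the phone.

        import java.security.KeyPair
        import java.security.KeyPairGenerator
        import java.util.Base64

        // First-run enrolment: the key pair is unique to this install, so the
        // backend can tell installs (and therefore users) apart, unlike a key
        // compiled into the app and shared by every copy.
        fun enrolThisInstall(): KeyPair {
            val keyPair = KeyPairGenerator.getInstance("EC")
                .apply { initialize(256) }
                .generateKeyPair()
            val publicKeyB64 = Base64.getEncoder().encodeToString(keyPair.public.encoded)
            // Here the app would send publicKeyB64 to the backend's sign-up
            // endpoint (omitted) and store the private key in the OS keystore.
            println("Registering install public key: $publicKeyB64")
            return keyPair
        }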

  5. Anonymous Coward
    Anonymous Coward

    Vulnerabilities

    Too much typing, not enough thinking and delete key.
