Hardcoded god-mode code found in RSA 2016 badge-scanning app

The official RSA phone app exhibitors use to scan delegate badges contains a hardcoded password allowing the vendors to access the full features of the device, says Bluebox Security's Andrew Blaich. Vendors at the San Francisco mega-conference expo hall were handed Android Samsung Galaxy S4 phones locked into kiosk mode and …

  1. bazza Silver badge
    FAIL

    Whooopsie!

    There's a lesson for us all, again...

    Once again Douglas Adams is proved right. Always know where your towel is.

    1. Anonymous Coward
      Anonymous Coward

      Re: Whooopsie!

      That's not really a new discovery. I know for a fact that the people who run the annual counter-terror event in London have been on the wrong side of Data Protection laws as well (with me, so that wasn't hard to unearth).

      The reason for that is simple: these people organise events, not security, and are thus far more focused on getting the logistics of such a show together than the technical aspects of data collection. However, if you start running shows for very specific sectors (as they do), you will have to accept that some of the higher margins of this specialisation should be spent on doing things right instead of bullshit manoeuvres such as posting some overweight security guards.

      As for me, it proved once again that offering an alias by default is not a bad tactic :)

    2. NoneSuch Silver badge
      FAIL

      "Just because an app is being used for one of the world's largest cyber security conferences doesn't automatically mean it's more secure."

      And banking, and VPN, and e-commerce, and medical info, and email, and personal data, and everything else under the sun. Putting these things in the Cloud is unnecessary and short sighted. If you don't control the infrastructure your info is on, you are putting it at risk.

    3. asdf

      waiting for other shoe to drop

      Cue DMCA cease and desist from the embarrassed company in 3 .. 2 .. 1. Especially if this guy ends up giving a talk.

  2. Yag
    Facepalm

    ...

    In this case, a single picture is worth a thousand words.

  3. Voland's right hand Silver badge

    The app was developed by?

    Well... well... well... Why doesn't this surprise me?

    Elliptic ciphers and broken RNGs anyone?

  4. Doctor Syntax Silver badge

    "Just because an app is being used for one of the world's largest cyber security conferences doesn't automatically mean it's more secure."

    In fact, it's more likely to be found insecure.

    1. Anonymous Coward
      Anonymous Coward

      More likely to be found out

      as more people will be trying to break it as it is a world leader...

      Maybe a device attribute, e.g. a serial number, that was unique and not embedded in the code would have been a marginally better approach. At least that could not be lifted straight from the code and used universally.

      However, leaving a hard-coded developer credential in a security application seems rather unforgivable, particularly given the high privilege it grants.

      1. Tomato42
        Boffin

        Re: More likely to be found out

        This problem was solved years ago: don't store the password in plaintext, store it hashed; preferably with something standard like scrypt or PBKDF2 with a large number of rounds.
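A minimal sketch of that approach in Python, using the standard library's `hashlib.pbkdf2_hmac` (the function names and iteration count here are illustrative, not anything from the app in question):

```python
import hashlib
import hmac
import os

def hash_password(password: str, iterations: int = 600_000) -> tuple[bytes, bytes]:
    """Derive a PBKDF2-HMAC-SHA256 digest; store only (salt, digest), never the password."""
    salt = os.urandom(16)  # fresh random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes,
                    iterations: int = 600_000) -> bool:
    """Re-derive the digest from the candidate password and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)

# Usage: only the (salt, digest) pair would ship with the app.
salt, digest = hash_password("hunter2")
print(verify_password("hunter2", salt, digest))  # True
print(verify_password("wrong", salt, digest))    # False
```

Even then, a hashed-but-hardcoded master credential is only a mitigation: it stops the password being read straight out of the binary, not the existence of a universal back door.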

        1. Crazy Operations Guy

          Re: More likely to be found out

          They don't need to store a password anyway. If you forget the password, you should just return the scanner to the conference host's support desk and receive a fresh one with a new password. It's not like these are out in the field; they're being used in a controlled conference venue loaded with support folk from the host's org.

          The scanning devices aren't doing all that much anyway: just a simple device that scans an attendee's conference ID and adds it to the scanner's list of contacts. At the end of the conference, each vendor would send their list of gathered IDs and the host would send back the contact info for those IDs.

          A few years back, I was at a conference for hardware engineers and they gave everyone a badge scanner built out of an RFID dev kit with a WiFi module attached. The RFID module would send the 64-bit badge ID to a microprocessor, which would concatenate it into "INSERT INTO " <vendor_id> " (contact_ID) VALUES (" <badge_id> ");". The WiFi module would take that string, dump it into an encrypted TCP packet and fire it off over an encrypted WLAN to an SQL server. At the end of the show, the vendors were sent home with the scanning equipment plus the schematics and source of the device; it was just a couple of dev kits put together, after all. The attendees also received the dev kits and the code to re-program the ID number in their badges.

          After the conference ended, the host scanned the databases for non-existent badge IDs to weed out vendors that tried to guess badge numbers to get additional contact info. The db hosting the contact info was isolated from the db tracking contacts, so there was no risk of hacking the badge scanner network to get at the private info.
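The string-building step described above might look roughly like this in Python (the table/column names follow the commenter's description; `vendor_id` and `badge_id` values are made up). Note that concatenating values into SQL like this is precisely why the host had to scan for bogus badge IDs afterwards; a parameterized query would be the safer pattern:

```python
# Sketch of the badge-scanner message format described in the comment.
# vendor_id identifies the vendor's table; badge_id comes from the RFID module.
vendor_id = "vendor_42"   # illustrative value
badge_id = 12345          # illustrative 64-bit badge ID

# The microprocessor builds the statement by concatenation, as described:
statement = f"INSERT INTO {vendor_id} (contact_ID) VALUES ({badge_id});"
print(statement)  # INSERT INTO vendor_42 (contact_ID) VALUES (12345);

# The WiFi module would then wrap this string in an encrypted TCP packet
# and send it over the encrypted WLAN to the SQL server.
```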

  5. Boothy

    I'd have thought at this conference in particular, that the developer should have expected someone to try to crack the app.

    They should have written it better, and then perhaps left some sort of Easter egg in there for people to find.

    1. Cynical Observer
      FAIL

      To be fair, the developer might not have known that it was about to be used at the RSA conference. Mind you - that only reduces the intensity of the slapping that they still deserve.

      A slapping is also due to RSA - they know the calibre of delegate their conference attracts - and unless they are leaving these festering piles as the equivalent of honeypots to reference in some concluding keynote (hey, never underestimate a company's ability to be devious), there should have been a team looking at ways the conference infrastructure might be compromised. It's RSA's reputation that will suffer more than the app developer's.

    2. MrWibble
      Devil

      Maybe it was a test to ensure the "right" people were at the conference?

  6. Pascal Monett Silver badge

    the security gaffe is "par for the course" in app development

    Well that's reassuring.

    NOT.

  7. allthecoolshortnamesweretaken

    Well, the conference is all about improving knowledge on security, so technically mission accomplished... But seriously, hardcoded? Now what does that remind me of...

    Even if the hardcoded password was meant as a fall-back mechanism in case the administrator's custom passcode is lost, it's still a bad idea that might not even work: you can lose the Post-It with the hardcoded password just as easily as you can lose the custom admin passcode. Sloppy coding as a result of very sloppy thinking.

  8. Simon Harris

    "it's usually a best practice to not leave a hardcoded password in your code”

    Surely that should read...

    "it's usually a best practice to not put a hardcoded password in your code in the first place”

    Put a 'debugging feature' in during development and by release time chances are someone will have forgotten it's still there.
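One common alternative to hardcoding is provisioning the secret per device at deployment time. A minimal sketch, assuming the passcode is injected via an environment variable (`KIOSK_ADMIN_PASSCODE` is a hypothetical name, not anything from the RSA app):

```python
import os

def get_admin_passcode() -> str:
    """Load the admin passcode provisioned for this device.

    The secret lives in the device's environment (set at deployment),
    so it never appears in the shipped source or binary. If it was
    never provisioned, fail loudly rather than fall back to a default.
    """
    passcode = os.environ.get("KIOSK_ADMIN_PASSCODE")
    if not passcode:
        raise RuntimeError("admin passcode not provisioned for this device")
    return passcode
```

The key design point is the absence of any fallback branch: a forgotten "debugging" default can't ship if the code refuses to run without a provisioned secret.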

    1. MiguelC Silver badge

      Re: "it's usually a best practice to not leave a hardcoded password in your code”

      Can't decide if he was being passive-aggressive or just making a huge understatement...

  9. Tom 7

    Debugging feature added during development

    and the removal of the same (and the 6 months testing that went with that) signed off by a manager.

  10. Ben Liddicott

    We have to stop thinking these things are accidents

    Really, why does anyone think this is not on purpose?

    1. Captain DaFt

      Re: We have to stop thinking these things are accidents

      "Really, why does anyone think this is not on purpose?"

      Simple, people are prone to doing idiotic things.

      Paradoxically, the brighter the person, the more idiotic the gaffe they make.

  11. Anonymous Coward
    Meh

    It's not a public app and it doesn't hold public data...

    > Vendors of the San Francisco mega-conference expo hall were handed Android Samsung Galaxy S4 phones, locked into kiosk mode

    So the hardware is provided to those who rented a stand at the show. With it, they scan the badges of attendees who stop by said stand, and at the end of the day or conference, the organizers send them a list of those visitors' details.

    So the only data on the phone is a list of badge numbers which are sent to the organizers?

    And this is insecure? Meh.

  12. g00se
    WTF?

    Bigger fail

    The password found within the app's code allowed the hackers to access the app’s settings. From there they tapped into the phone's system settings ...

    Worse than leaving a hard-coded password in the app: why on earth should breaking into application-level code lead on to (what should be) OS-level access?

    1. Simon_E

      Re: Bigger fail

      Because it lets it out of kiosk mode? Which turns it back into an ordinary rootable phone?

      No?

      1. g00se

        Re: Bigger fail

        You're probably right!

  13. Christian Berger

    They install apps from known bad sources on their telephones?

    I mean, seriously, RSA can be considered a known bad source by now, but installing any additional software from a source you don't _fully_ trust essentially violates the integrity of your device.

    Yes, some mobile operating systems claim to provide sandboxes that separate your programs, but new developments like Rowhammer or Cachebleed show that sandboxes can only save you from simple mistakes, not from a determined attacker. I would expect everyone at a computer security conference to have reached that point of insight.
