OAuth 2.0 standard editor quits, takes name off spec

The lead author and editor of the OAuth 2.0 network authorization standard has stepped down from his role, withdrawn his name from the specification, and quit the working group, describing the current version of the spec as "the biggest professional disappointment of my career." Eran Hammer, who helped create the OAuth 1.0 …

COMMENTS

This topic is closed for new posts.
  1. Anonymous Coward

    When all you have is a Hammer, everything looks like a Fail.

  2. Anonymous Coward

    Facebook OAuth security

    Is already compromised when it comes to applications, as the authorisation handshake passes application tokens in the query part of the URL. This allows the token to be intercepted and used by other rogue apps to post spam messages.

    1. Anonymous Coward

      Re: Facebook OAuth security

      Not under SSL

      1. Anonymous Coward

        Re: Facebook OAuth security

        If the authentication is being done in a browser window, then FB passes back a URL which contains an extractable valid access token.

        1. Daniel Friesen

          Re: Facebook OAuth security

          As long as you're using the "Authorization Code Grant", what gets returned in the browser URL is not an access token but a temporary authorization code. After the client gets that authorization code from the URL, it makes a request to the token endpoint using its client credentials, and the access token is returned in the response to that request.

          Exchanging an authorization code for an access token requires both the authorization code and the client credentials. Because a 3rd party does not know the client secret, the authorization code is completely useless to it even if it gets ahold of one. Additionally, seconds later the real client makes its own token request and the authorization code becomes worthless. And it would expire minutes later anyway.
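
          A minimal sketch of that code-for-token exchange in Python (the endpoint URL, redirect URI, and client values here are placeholders, not any real provider's):

              import requests  # third-party HTTP client library

              # Placeholders: substitute your provider's token endpoint and
              # your registered client values.
              TOKEN_ENDPOINT = "https://provider.example.com/oauth/token"
              CLIENT_ID = "my-client-id"
              CLIENT_SECRET = "my-client-secret"   # held server-side, never sent to the browser
              REDIRECT_URI = "https://app.example.com/callback"

              def exchange_code(auth_code):
                  """Swap the short-lived authorization code for an access token."""
                  resp = requests.post(TOKEN_ENDPOINT, data={
                      "grant_type": "authorization_code",
                      "code": auth_code,               # the code visible in the browser URL
                      "redirect_uri": REDIRECT_URI,
                      "client_id": CLIENT_ID,
                      "client_secret": CLIENT_SECRET,  # the part an eavesdropper lacks
                  })
                  resp.raise_for_status()
                  return resp.json()["access_token"]   # token never passes through the browser

          Without the client secret, an intercepted authorization code gets an attacker nothing.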

          There are issues with OAuth 2, but this part of the theory of auth is soundly done.

          1. Anonymous Coward

            Re: Facebook OAuth security

            That grant is not part of Facebook's setup. A fully active short-life token is issued and then exchanged for a long-life token. When you authorise an app through the browser, it passes the active access token in the URL. That token can be used to post spam. Suggest you search on "Facebook Bomb Like", which will show you just how it's done.
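
            To illustrate why that matters (the URL and parameter names below are schematic, not Facebook's actual endpoints), a token carried in the redirect is trivially extractable by anything that can read the URL:

                from urllib.parse import urlparse, parse_qs

                # A made-up implicit-style redirect carrying a live token
                redirect = "https://app.example.com/callback#access_token=SHORT_LIVED&expires_in=3600"

                fragment = urlparse(redirect).fragment          # everything after the '#'
                token = parse_qs(fragment)["access_token"][0]   # live and usable immediately

            No secret is needed; whoever holds that token can call the API with it until it expires.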

  3. John Smith 19

    At last. A protocol that supports "Embrace, extend, extinguish" from day 1

    How exciting is that

  4. Anonymous Coward

    Hallelujah

    Now if only the rest of the WS- gang would drop off the same way...

  5. Anonymous Coward

    Easy to implement isn't always good

    The line "[...] that OAuth 2.0 is much easier to implement than OAuth 1.0" doesn't mean as much as it might seem at first. Yes, it may be easier to implement something that ticks all the boxes to be OAuth 2.0, but does that mean it will interoperate with any other implementation of OAuth 2.0? From what I read in Hammer's post, the issue is that there are so many areas in the OA2 spec that aren't nailed down - that are up to the implementor to choose - that you can have two fully compliant implementations that won't talk to one another because of a myriad of differences in the choices made by the implementors.
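
    As a sketch of how that plays out in practice (the API URL is made up; the two presentation styles are ones that genuinely diverged between early OAuth 2.0 deployments):

        import requests

        token = "abc123"  # an access token obtained elsewhere

        # Client A presents it as a bearer token in the Authorization header...
        requests.get("https://api.example.com/me",
                     headers={"Authorization": "Bearer " + token})

        # ...while Client B puts it in a query parameter. Both can claim to
        # implement OAuth 2.0; a server built for one rejects the other.
        requests.get("https://api.example.com/me",
                     params={"access_token": token})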

    I remember a story about POSIX, which likewise has lots of places where an implementor can make choices about what to implement and what should simply return a not-implemented error. One comment was that it was possible to create an OS that was fully POSIX compliant yet totally unable to run any program written to the POSIX API, "but who would be crazy enough to do that." Then came the POSIX layer for Windows NT....

    1. James 47

      Re: Easy to implement isn't always good

      In fairness, bolting a layer on top of an existing one is quite difficult. Symbian has a POSIX layer too, and there are some concepts that just do not translate across.

    2. S Watts

      Re: Easy to implement isn't always good

      """Then came the POSIX layer for Windows NT...."""

      ...allowing it to comply with a US Gov (DOD?) tender for a POSIX-compliant (level not specified) OS, undercutting the UNIX iron-mongers, and thus forcing a shambolic and archaic operating system on the end users.

  6. Yes Me

    '"I honestly don't know what use cases OAuth 2.0 is trying to solve any more," Hammer says.'

    He could always try reading the draft about that:

    http://datatracker.ietf.org/doc/draft-ietf-oauth-use-cases

    What is missing from the set of documents (and from the WG charter) is an analysis of why OAuth 1 is unsatisfactory. It's very common for the 2nd generation of a protocol to be much more complex than its predecessor, but it would be nice to know why.

    Anyway, as usual, the IETF will propose, but the market will decide.

    1. Anonymous Coward

      re: the IETF will propose, but the market will decide.

      I assume by this what you really mean is: The IETF will propose, but all major commercial companies will just implement whatever the hell they feel like for years before anything resembling a coherent and finished draft appears - by which time vendors will simply argue over who gets to 'win' in terms of specific implementations regardless of what the finished draft says.

      Fun times are a-coming.

      1. asdf

        Re: re: the IETF will propose, but the market will decide.

        > for years before anything resembling a coherent and finished draft appears - by which time vendors will simply argue over who gets to 'win' in terms of specific implementations regardless of what the finished draft says.

        Wow, so it sounds like every other major networking protocol/technology then. V.90, anyone?

  7. wildwoodweed

    making the spec correct

    The idea that the "web culture" is pure and/or superior is misguided. "Making the spec correct" sounds nice and all, but it is a meaningless statement. Correct for whom, for what use cases, and under which constraints? Like everything else, specs are subjective. People who don't understand this are dangerous and likely to engage in non-productive activities such as fitting square pegs into round holes and waging technology jihads.

    A good spec solves "the right" use cases for "a broad enough" community, being "strict enough" to promote interoperability while simultaneously providing "enough" extensibility to allow for evolution. All of the key bits of that last statement involve some sort of subjective judgement. You could say that the judgement of someone from the "web community" is likely to be "cleaner" than that of someone in the "enterprise community", in that it isn't subject to marketing and product pressures, but "cleaner" does not necessarily equate to "better" in all cases.
