When all you have is a Hammer, everything looks like a Fail.
OAuth 2.0 standard editor quits, takes name off spec
The lead author and editor of the OAuth 2.0 network authorization standard has stepped down from his role, withdrawn his name from the specification, and quit the working group, describing the current version of the spec as "the biggest professional disappointment of my career." Eran Hammer, who helped create the OAuth 1.0 …
-
-
-
-
Monday 30th July 2012 08:08 GMT Daniel Friesen
Re: Facebook OAuth security
As long as you're using the "Authorization Code Grant", what gets returned in the browser URL is not an access token but a temporary authorization code. After the client gets that authorization code from the URL, it makes a request to the token endpoint using its client credentials, and an access token is returned in that response.
Exchanging an authorization code for an access token requires both the authorization code and the client credentials. Because a third party does not know the client secret, the authorization code is completely useless to it even if it gets ahold of the code. Additionally, seconds later the real client makes a token request and the authorization code becomes worthless; it would expire minutes later anyway.
There are issues with OAuth 2, but this part of the theory of auth is soundly done.
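The checks described above can be sketched as toy server-side logic. All names here (issue_auth_code, exchange, "app1", "s3cret") are made up for illustration; a real provider would also bind codes to a redirect_uri, require TLS, and so on.

```python
# Toy sketch of the token-endpoint checks described in the comment above.
# Names and values are illustrative, not any real provider's API.
import secrets
import time

issued = {}  # authorization codes the server has handed out

def issue_auth_code(client_id):
    """Mint a short-lived, single-use authorization code for a client."""
    code = secrets.token_urlsafe(16)
    issued[code] = {"client_id": client_id,
                    "expires": time.time() + 600,  # minutes, not hours
                    "used": False}
    return code

def exchange(code, client_id, client_secret, registered):
    """Trade a code for an access token; fails without the client secret."""
    entry = issued.get(code)
    if entry is None or entry["used"] or time.time() > entry["expires"]:
        return None  # unknown, replayed, or expired code
    if entry["client_id"] != client_id or registered.get(client_id) != client_secret:
        return None  # the code alone is useless without the matching secret
    entry["used"] = True  # the code becomes worthless after one exchange
    return secrets.token_urlsafe(24)  # the actual access token
```

An attacker who scrapes the code from the browser URL but lacks the client secret gets nothing from the exchange, and once the legitimate client redeems the code it cannot be replayed.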
-
Monday 30th July 2012 19:15 GMT Anonymous Coward
Re: Facebook OAuth security
That grant is not part of Facebook's setup. A fully active short-life token is issued and then exchanged for a long-life token. When you authorise an app through the browser, it passes the active access token in the URL. That token can be used to post spam. Suggest you search on Facebook Bomb Like, which will show you just how it's done.
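The difference between the two redirect styles being argued about here can be illustrated with a couple of example URLs. These are illustrative only, not Facebook's real endpoints or token formats.

```python
# Illustrative URLs only -- not any real provider's endpoints or tokens.
from urllib.parse import urlparse

# Code flow: the browser only ever sees a one-time authorization code.
code_flow_redirect = "https://app.example/cb?code=SplxlOBeZQQYbYS6WxSbIA"

# Token ("implicit") flow: the browser URL carries a live access token
# in the fragment, which is why leaking it is immediately exploitable.
token_flow_redirect = ("https://app.example/cb"
                       "#access_token=2YotnFZFEjr1zCsicMWp&token_type=bearer")
```

In the first URL an eavesdropper gains nothing usable without the client secret; in the second, the value in the URL is itself the credential.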
-
-
-
-
-
Saturday 28th July 2012 13:40 GMT Anonymous Coward
Easy to implement isn't always good
The line "[...] that OAuth 2.0 is much easier to implement than OAuth 1.0," doesn't mean as much as it might seem at first. Yes, it may be easier to implement something that ticks all the boxes to be OAuth 2.0, but does that mean that it will interoperate with any other implementations of OAuth 2.0? From what I read in Hammer's post, the issue is that there are so many areas in the OA2 spec that aren't nailed down - that are up to the implementor to choose - that you can have two fully compliant implementations and they won't talk to one another because of a myriad of differences in the choices made by the implementors.
I remember a story about POSIX, which likewise has lots of places where an implementor can make choices about what to implement and when to return a "not implemented" error. One comment was that it was possible to create an OS that was fully POSIX compliant yet was totally unable to run any programs written to the POSIX API, "but who would be crazy enough to do that." Then came the POSIX layer for Windows NT....
-
Monday 30th July 2012 07:18 GMT S Watts
Re: Easy to implement isn't always good
"""Then came the POSIX layer for Windows NT...."""
...allowing it to comply with a US Gov (DoD?) tender for a POSIX-compliant (level not specified) OS, undercutting the UNIX iron-mongers, and thus forcing a shambolic and archaic operating system on the end users.
-
Sunday 29th July 2012 07:51 GMT Yes Me
'"I honestly don't know what use cases OAuth 2.0 is trying to solve any more," Hammer says.'
He could always try reading the draft about that:
http://datatracker.ietf.org/doc/draft-ietf-oauth-use-cases
What is missing from the set of documents (and from the WG charter) is an analysis of why OAuth 1 is unsatisfactory. It's very common for the 2nd generation of a protocol to be much more complex than its predecessor, but it would be nice to know why.
Anyway, as usual, the IETF will propose, but the market will decide.
-
Sunday 29th July 2012 08:29 GMT Anonymous Coward
re: the IETF will propose, but the market will decide.
I assume by this what you really mean is: The IETF will propose, but all major commercial companies will just implement whatever the hell they feel like for years before anything resembling a coherent and finished draft appears - by which time vendors will simply argue over who gets to 'win' in terms of specific implementations regardless of what the finished draft says.
Fun times are a-coming.
-
Monday 30th July 2012 20:27 GMT asdf
Re: re: the IETF will propose, but the market will decide.
> for years before anything resembling a coherent and finished draft appears - by which time vendors will simply argue over who gets to 'win' in terms of specific implementations regardless of what the finished draft says.
Wow so sounds like every other major networking protocol/technology then. v.90 anyone?
-
-
-
Tuesday 31st July 2012 15:38 GMT wildwoodweed
making the spec correct
The idea that the "web culture" is pure and/or superior is misguided. "Making the spec correct" sounds nice and all, but it is a meaningless statement. Correct for whom and what use cases and under which constraints? Like everything else, specs are subjective. People who don't understand this are dangerous and likely to engage in non-productive activities such as fitting square pegs into round holes and engaging in technology jihads.
A good spec solves "the right" use cases for "a broad enough" community, being "strict enough" to promote interoperability while simultaneously providing "enough" extensibility to allow for evolution. All of the key bits of that last statement involve some sort of subjective judgement. You could say that the judgement of someone from the "web community" is likely to be "cleaner" than that of someone in the "enterprise community", in that it isn't subject to marketing and product pressures, but "cleaner" does not necessarily equate to "better" in all cases.