The problem is the architecture, not the authentication technology
A recent Information Week article raises some excellent points about application security. However, I have a few issues with the article.
The writer missed the most newsworthy event: the impending approval of OpenID Connect, a profile of OAuth2, which will probably replace SAML. I don't even think Microsoft is recommending you expand your WS-* deployment right now, because even they probably don't want to support it in the future.
Also, the writer seems to posit that "SSO" is the silver bullet. I think this is too low a target. What we really need is inter-domain trust elevation. The goal is to use an adaptive authentication strategy: depending on the context, you figure out whether you need to re-authenticate the person.
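To make the idea of adaptive, context-driven re-authentication concrete, here is a minimal sketch of a step-up decision policy. The context fields, threshold values, and return labels are all illustrative assumptions, not part of any standard; a real deployment would feed this from device fingerprinting, network data, and a risk engine.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class RequestContext:
    # Hypothetical context signals an adaptive-auth policy might consume.
    known_device: bool
    ip_in_usual_range: bool
    last_auth: datetime
    resource_sensitivity: str  # "low", "medium", or "high"


def required_step_up(ctx: RequestContext, now: datetime) -> str:
    """Decide what the current session needs before granting access:
    'none', 'password' (re-authenticate), or 'second_factor' (step up)."""
    session_age = now - ctx.last_auth

    # High-value resources always require a fresh second factor.
    if ctx.resource_sensitivity == "high":
        return "second_factor"

    # An unfamiliar device or network lowers trust in the session.
    if not ctx.known_device or not ctx.ip_in_usual_range:
        return "password" if ctx.resource_sensitivity == "low" else "second_factor"

    # Familiar context: only step up once the session has gone stale.
    if session_age > timedelta(hours=8):
        return "password"
    return "none"
```

The point is that the authentication requirement is computed per request from context, rather than fixed at login time, which is what a plain SSO deployment gives you.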
I agree with the author that in order to build trust, you need both tools and rules. However, I would argue that the tools have been somewhat to blame. As the author points out, the current environments for domain security are heterogeneous. Organizations are supporting multiple authentication and single sign-on interfaces, including Kerberos, LDAP, SAML, OAuth2, RADIUS, and various proprietary 2FA protocols (ACE Server, Duo API, Toopher API, etc.), to control access to internal and SaaS applications. Without technical standards, the people who write the rules have a hard time separating the signal from the noise.
As the impending standardization of OpenID Connect demonstrates (add OAuth2 to the list above… the latest and greatest JSON/REST APIs), these tools have only recently become available. In fact, the tools are still evolving. UMA has not yet been finalized as a standard, although adoption looks promising, as several vendors have committed to interop testing.
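One reason the "JSON/REST" framing matters: an OpenID Connect provider publishes its configuration as a plain JSON document at a well-known path under the issuer URL, so a client can bootstrap itself with one HTTP GET rather than exchanging XML metadata out of band as with SAML. A minimal sketch (the issuer URL in the comment is a placeholder, not a real provider):

```python
import json
import urllib.request


def discovery_url(issuer: str) -> str:
    """Build the OpenID Connect discovery URL for an issuer.
    Per the OIDC Discovery spec, the document lives at
    /.well-known/openid-configuration under the issuer URL."""
    return issuer.rstrip("/") + "/.well-known/openid-configuration"


def fetch_oidc_config(issuer: str) -> dict:
    """Fetch and parse the provider's discovery document (plain JSON)."""
    with urllib.request.urlopen(discovery_url(issuer)) as resp:
        return json.load(resp)


# Usage sketch: the document advertises the endpoints a client needs, e.g.
#   config = fetch_oidc_config("https://accounts.example.com")
#   config["authorization_endpoint"], config["token_endpoint"], config["jwks_uri"]
```

Compare that to configuring a SAML trust, and the appeal of the newer tooling is obvious.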
Authentication and application security are inherently inter-domain problems. Organizations need to think about both inbound authentications from their partners and outbound authentications to SaaS and internal applications. My advice to companies is somewhat pragmatic: evaluate each application on a case-by-case basis… and move new applications to the current technology (don't make your future upgrade job bigger).
My other piece of advice is to form a multi-party federation in your ecosystem. This can help you align both the technical and the legal details, enabling you to drive down the cost of inter-domain trust between suppliers, customers, and partners.
Article resource – http://thegluuserver.wordpress.com/2014/05/16/how-to-benchmark-ox-for-a-large-scale-deployment/