Why (re-)learn Authentication and Authorization? – DEVELOPPARADISE
17/07/2018

“What happens with smaller businesses is that they give in to the misconception that their site is secure because the system administrator deployed standard security products – firewalls, intrusion detection systems, or stronger authentication devices such as time-based tokens or biometric smart cards. But those things can be exploited.”
Kevin Mitnick

Authentication and Authorization have been integral parts of every system I have worked on, for as long as I can remember. Things have changed over time, though. Once there was only a table with usernames and passwords, and the standalone application would display a login screen at the beginning of a session. The application was responsible for checking that the password matched the username, and for restricting access to resources based on who the user was. A session was in place, and the logged-in user was so taken for granted that we did not have to worry about it. The user data was always available.

So many scenarios came up after those ancient cyber-security times that new problems, and new solutions, appeared every year. First, there was the problem of many different systems with separate security components and credentials: so many passwords that users were overwhelmed just by having to keep their long password lists up to date and secure.

Single Sign-On (SSO) emerged as the obvious solution for all login-related problems: one single system would control authentication and authorization for every application in the enterprise. Then came web and mobile applications. The number of passwords at the company was compounded by the ones for personal applications, and it was a credential-management nightmare all over again.

On top of that, new distributed architectures were devised to take advantage of the ubiquity of the Internet. Web services and stateless protocols became the norm, and the authentication and authorization nightmare migrated from the users’ turf to the developers’.

I hope it is obvious to everyone that, with so many new and flexible features, the number of vulnerabilities skyrocketed. It was no longer just a matter of downloading a virus or someone guessing your weak password. Modern applications are a mesh of interconnected components, and each one can potentially be attacked by anyone, anywhere on the planet.

Of course, many smart people have been worrying about these issues for a long time. Every time a security problem emerges, an army of bright minds tries to find a clever fix. Fortunately, collaboration among these solution finders has also increased enormously. Big companies concerned with web security joined forces to standardize the security approaches for this new generation of applications.

The Open Web Application Security Project (OWASP, www.owasp.org) was created in 2001 and publishes multiple guides for web security, as well as a regularly updated Top Ten Web Application Vulnerabilities list that is worth checking and rechecking from time to time.

In terms of Authentication and Authorization, three organizations are worth mentioning: the Internet Engineering Task Force (IETF, https://www.ietf.org), the OpenID Foundation (OIDF, https://openid.net/) and the Fast Identity Online Alliance (FIDO Alliance, https://fidoalliance.org/).

The IETF has been working on web authorization for a long time. In 2008 it decided to standardize the web-authorization work of Blaine Cook, former lead developer at Twitter. OAuth 1.0, the open protocol for authorization and access delegation, was published as an IETF standard (RFC 5849) in 2010. By the end of 2012, OAuth 2.0 (RFC 6749) had been published, and it has since been supported and implemented by many giants of the web, such as Google, Microsoft, and Amazon.

The OIDF was created in 2007 and has since focused on creating an open standard for web authentication. It published the initial version of the OpenID standard that same year, and OpenID 2.0 in 2008. Many companies endorsed OpenID, but in 2013 Facebook left the foundation in favor of its own proprietary protocol, Facebook Connect, alleging that OpenID did not support modern development techniques, especially Representational State Transfer (REST) web services. The OIDF addressed these concerns in 2014 by releasing OpenID Connect, a new generation of OpenID technology that works as an authentication layer on top of OAuth 2.0. The new specification is extensible, supporting optional features such as encryption of identity data, discovery of OpenID providers, and session management.
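The key artifact that OpenID Connect layers on top of OAuth 2.0 is the id_token, a JSON Web Token (JWT) carrying the user's identity claims. As a rough sketch of what that looks like on the wire, the following Python snippet builds a toy id_token and decodes its claims segment. All issuer, subject, and client values here are invented for illustration, and real code must verify the token's signature with a proper JWT library rather than merely decoding it.

```python
import base64
import json

def decode_jwt_payload(token: str) -> dict:
    """Decode the claims segment of a JWT (header.payload.signature).

    NOTE: this only *reads* the claims; it does NOT verify the signature.
    Production code must validate the token with a real JWT library.
    """
    payload_b64 = token.split(".")[1]
    # Restore the base64url padding that JWT encoding strips.
    payload_b64 += "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# Build a toy, unsigned id_token just to exercise the decoder
# (every claim value below is made up).
header = base64.urlsafe_b64encode(
    json.dumps({"alg": "none"}).encode()).rstrip(b"=").decode()
claims = {
    "iss": "https://accounts.example.com",  # who issued the token
    "sub": "user-123",                      # stable user identifier
    "aud": "my-client-id",                  # intended recipient
}
payload = base64.urlsafe_b64encode(
    json.dumps(claims).encode()).rstrip(b"=").decode()
token = f"{header}.{payload}."

print(decode_jwt_payload(token)["sub"])  # → user-123
```

The point of the sketch is simply that identity travels as structured, verifiable claims inside the OAuth 2.0 flow, instead of each application inventing its own user-lookup mechanism.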

The FIDO Alliance is the new cool kid on the block. Created in 2013, it focuses on standardizing different authentication technologies, including biometrics (fingerprint readers and iris scanners, for example), security tokens, Bluetooth, and near-field communication (NFC), which lets devices such as smartphones communicate with each other when placed in close proximity, usually within an inch.

OAuth 2.0 and OpenID Connect are of particular importance for web developers, as together they address authentication and authorization in many different web scenarios: the use of third-party authentication providers such as Google, LinkedIn, or Microsoft; mobile applications; and the secure use of RESTful web services, both when the application is the only client of the web service and when it is making requests on behalf of a logged-in user. In future posts I intend to address the implementation of OAuth 2.0 and OpenID Connect in Java Spring and ASP.NET Core applications.
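To make the delegation idea a bit more concrete, here is a minimal, hypothetical sketch in Python of the first leg of the OAuth 2.0 authorization code grant: building the URL the user's browser is redirected to for login and consent. The endpoint, client ID, and redirect URI are invented for illustration; a real application would use the values issued by its chosen provider.

```python
from urllib.parse import urlencode, urlparse, parse_qs

# Hypothetical authorization endpoint -- not a real provider.
AUTHORIZE_ENDPOINT = "https://auth.example.com/oauth2/authorize"

def build_authorization_url(client_id: str, redirect_uri: str,
                            scope: str, state: str) -> str:
    """First leg of the OAuth 2.0 authorization code grant.

    The user's browser is sent to this URL to log in and consent;
    the provider then redirects back to redirect_uri with
    ?code=...&state=..., and the application exchanges the code
    for an access token in a separate back-channel request.
    """
    params = {
        "response_type": "code",   # selects the authorization code grant
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": scope,
        "state": state,            # anti-CSRF value; verify it on return
    }
    return f"{AUTHORIZE_ENDPOINT}?{urlencode(params)}"

url = build_authorization_url(
    client_id="my-client-id",
    redirect_uri="https://app.example.com/callback",
    scope="openid profile",
    state="random-opaque-value",
)
query = parse_qs(urlparse(url).query)
print(query["response_type"][0])  # → code
```

Note that the user's password never passes through the application: the provider authenticates the user and hands back only a short-lived code, which is the whole point of delegated authorization.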

As you can see, most of these standards are only a few years old at the time of this post, and it takes some time after a standard is published for library implementations to become available. The use-case scenarios and workflows are sometimes complex and require care to avoid introducing vulnerabilities when architecting a system. Stay tuned, as I will explain OAuth 2.0 and OpenID Connect in more detail in the near future.