July 22: Talks (22.07.2020, 14:00 - 20:30)

The second day starts with the official opening of the workshop, followed by two tracks of talks with eight talk slots each. Each talk is 25 minutes.

Time Table




Profiling OAuth 2.0 and OpenID Connect for Enterprise Use
We will describe MITRE's efforts to profile OAuth 2.0 and OpenID Connect so that they can be used in a secure and interoperable manner for enterprise environment use cases. Our profiles leverage protocol extensions, profiles, security guidance, and other work by the IETF OAuth Working Group and the OpenID Foundation FAPI and iGov Working Groups. We will compare our profiles with other efforts and describe the open challenges that remain. Our targeted enterprise use cases include:
• user authorization delegation to a web application (using OAuth 2.0)
• user authorization delegation to a native application (using OAuth 2.0)
• user authentication to a web application (using OpenID Connect)
The OAuth 2.0 and OpenID Connect standards are used ubiquitously across the Internet for delegated authorization and federated authentication. In an enterprise environment, OAuth and OpenID Connect can enable significant improvements over legacy approaches, for example by:
• Abstracting user and device authentication away from individual web applications and native applications, making it possible to adapt authentication approaches without modifying every existing application, and to provide single sign-on.
• Eliminating the need for applications to fully impersonate user identities when interacting with resource servers.
• Enabling authorization decisions based on attributes of both the user and the client, rather than just the user.
However, the base specifications alone are insufficient for enterprise adoption due to numerous optional requirements, undefined behaviors, and issues identified since their publication, all of which hinder security and interoperability.
We will also describe our efforts to profile the IETF RFC 8693 OAuth 2.0 Token Exchange specification to enable protected resources to access other protected resources in order to satisfy a query received from a client, including addressing complexities such as multi-organization environments. We would like to explore opportunities to collaborate with others facing similar enterprise challenges and to consolidate our work with related efforts.
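
The resource-to-resource pattern described above builds on RFC 8693, in which a protected resource exchanges the token it received for a new token aimed at a downstream resource. The sketch below shows the standard form parameters of such a request; the endpoint, token values, and resource URL are placeholders, and MITRE's actual profile may constrain these parameters further.

```python
# Illustrative RFC 8693 token exchange request: a protected resource
# exchanges the access token it received from a client for a new token
# scoped to a downstream protected resource. All values are placeholders.

def build_token_exchange_request(subject_token: str, resource: str) -> dict:
    """Form parameters for an OAuth 2.0 Token Exchange (RFC 8693) request."""
    return {
        "grant_type": "urn:ietf:params:oauth:grant-type:token-exchange",
        # The token the resource server received and wants to exchange:
        "subject_token": subject_token,
        "subject_token_type": "urn:ietf:params:oauth:token-type:access_token",
        # The downstream protected resource the new token will be used at:
        "resource": resource,
        "requested_token_type": "urn:ietf:params:oauth:token-type:access_token",
    }

params = build_token_exchange_request("eyJhbGciOi...", "https://backend.example.com/api")
# These would be POSTed as application/x-www-form-urlencoded to the
# authorization server's token endpoint, with client authentication.
```

In a multi-organization environment, the `resource` parameter is what lets the authorization server decide which downstream audience, and potentially which partner organization's policy, applies to the issued token.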
Michael Peck, Mark Russell
Room: Plenary
app2app OAuth2
Several OAuth2 ecosystems (the largest of which is likely the UK OpenBanking ecosystem) have recently started using a process known as 'app2app' to perform OAuth2 authorization. This allows a better user experience on mobile (iOS/Android) devices, as user authorization/consent can be done in the user's existing first-party app and can leverage biometrics and other methods that may not be possible, or as easy, in a web-based experience.

It is very similar to the 'Claimed "https" Scheme URI Redirection' mechanism applied to the client's redirect URI in the IETF 'OAuth 2.0 for Native Apps' BCP (now colloquially known as 'app2web'), but applied instead to the authorization server's authorization endpoint. One way of viewing this is that a first-party native app becomes the 'user-agent' referred to in RFC 6749. As with the existing BCP, it gracefully degrades into 'app2web' and 'web2app' variants.

The expanded remit, and some related issues it brings (in particular, a trend of authorization servers publishing alternative authorization endpoints via out-of-band mechanisms), are worthy of further examination. Additional guidance may be needed for authorization server vendors on providing secure APIs that allow native apps to process authorization requests and to create the authorization endpoint response ready to return to the OAuth2 client. To some extent these APIs might look very similar to the existing authorization endpoints, or to the endpoints the AS exposes to the web app it serves at the authorization endpoint.
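
To make the mechanism concrete, here is a minimal sketch of building a standard authorization request (RFC 6749 parameters with RFC 7636 PKCE). The app2app aspect is not in the URL itself but in how it is dispatched: the authorization endpoint is a claimed "https" URL, so on a device where the AS's first-party app has claimed it, the OS opens that app instead of the browser, and it falls back to 'app2web' otherwise. All URLs and the client_id are placeholder assumptions.

```python
# Sketch of an authorization request as a native client would build it for
# app2app. The URL is opened via the OS (e.g. an Android intent or iOS
# universal link); if the AS's first-party app has claimed the authorization
# endpoint URL, the OS routes it there, otherwise the browser handles it.

import base64
import hashlib
import secrets
from urllib.parse import urlencode

def build_authorization_url(authorization_endpoint: str, client_id: str,
                            redirect_uri: str) -> tuple:
    """Return (authorization_url, pkce_verifier) for an app2app-style flow."""
    # PKCE verifier/challenge per RFC 7636 (S256 method):
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    challenge = base64.urlsafe_b64encode(
        hashlib.sha256(verifier.encode()).digest()).rstrip(b"=").decode()
    query = urlencode({
        "response_type": "code",
        "client_id": client_id,
        # Itself a claimed "https" URL of the client app, so the
        # authorization response is routed back app-to-app:
        "redirect_uri": redirect_uri,
        "code_challenge": challenge,
        "code_challenge_method": "S256",
        "state": secrets.token_urlsafe(16),
    })
    return f"{authorization_endpoint}?{query}", verifier

url, verifier = build_authorization_url(
    "https://as.example.com/authorize",      # claimed by the AS's own app
    "example_client_id",
    "https://app.example.com/callback")      # claimed by the client app
```

The graceful degradation is exactly why the same URL works for all variants: the request is unchanged, only the OS-level routing of the claimed URLs differs.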
Joseph Heenan
Room: ROOM 2




The Road to OpenID Connect
Providing services to customers in multiple countries, we had the challenge of connecting to more than 25 different eID providers across Europe. These providers use many different kinds of connections, and we needed to offer one simple API for our customers working cross-border. The result was OpenID Connect as the customer-facing API. The presentation will give an overview of the architecture and show the process we went through to arrive at our current solution, including the challenges and lessons learned along the way.

Implementing OpenID Connect at a large enterprise in 2020 is something of a unique experience, as you have to choose between some relatively established off-the-shelf solutions, some established frameworks, and even some SaaS offerings. The flexibility and openness of the OpenID Connect and OAuth specifications create challenges in how to adopt the standards, but also give you the ability to tailor the solution to your specific use case. Our process is further complicated by the fact that the product already has relatively high usage: more than 25 different eID providers across Europe, more than 1,000 active customers, and over 1 million transactions per day.

The presentation will cover the following:
* How did we approach OpenID Connect the first time we tried?
* What lessons did we learn from this attempt, and why did it fail?
* How did we then approach OpenID Connect the second time?
* What did we do?
* How did we decide on a way forward to implementation?
* What have been some of the challenges?
* What lessons have we learned?
* The way forward with OpenID Connect
Dag Sneeggen
Room: Plenary





OAuch: Analyzing the Security Best Practices in the OAuth 2.0 Ecosystem
OAuch is a security testing framework for OAuth 2.0 implementations. Implementations are semi-automatically tested using a large set of security-related tests, based on the requirements put forth by the original OAuth 2.0 specification as well as several other documents that refine its security assumptions and requirements. OAuch also includes test sets specific to OpenID Connect (OIDC) and Financial-grade APIs.

OAuch computes an overall score that indicates how well the tested site adheres to the security requirements of the standards. Vulnerabilities that are found, or countermeasures that are missing from the implementation, all contribute to a lower score. The impact of a failed test case on the resulting score depends on the requirement level specified by the standard: for example, a missing countermeasure denoted as a 'MUST' in the standard will negatively impact the score more than a missing 'SHOULD' countermeasure. In addition to the score, the framework generates a report with information about the failed test cases. This includes a description of each test case, a link to the (section in the) relevant standard, and a detailed log of the test. The log contains raw data such as HTTP requests and decoded JWT tokens, allowing the owner of the service to reconstruct the test.

It is important to note that OAuch is a testing framework that tests compliance with the OAuth 2.0 specification. Although it focuses on security, a failed test does not necessarily imply that a vulnerability exists in the implementation, so it should not be seen as a vulnerability scanner. That said, failed tests do indicate weak points in the implementation that may be exploitable; as such, an attacker could use OAuch to quickly determine which attack vectors to focus on. This talk presents the OAuch test framework.
We will also give a first look at the results of our analysis of the OAuth/OIDC ecosystem, where we test a number of high-profile implementations and providers.
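
The requirement-level weighting described above can be sketched as a simple scoring function. The weights here are illustrative assumptions, not OAuch's actual formula; the only property taken from the abstract is that a failed 'MUST' lowers the score more than a failed 'SHOULD'.

```python
# Hypothetical compliance-scoring sketch in the spirit of the description
# above: each test checks one requirement, and a failure costs a weight
# determined by the RFC 2119 requirement level. Weights are assumptions.

WEIGHTS = {"MUST": 1.0, "SHOULD": 0.5, "MAY": 0.1}

def compliance_score(results: list) -> float:
    """results: list of (requirement_level, passed) pairs; returns 0..1."""
    total = sum(WEIGHTS[level] for level, _ in results)
    earned = sum(WEIGHTS[level] for level, passed in results if passed)
    return earned / total if total else 1.0

# A failed MUST hurts more than a failed SHOULD:
low = compliance_score([("MUST", False), ("SHOULD", True)])   # 0.5 / 1.5
high = compliance_score([("MUST", True), ("SHOULD", False)])  # 1.0 / 1.5
```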
Pieter Philippaerts
Room: ROOM 2


Formal Security Analysis of the Web Payment APIs
Many proprietary solutions exist to facilitate online sales processes, and they all share the need to obtain customer information and perform a corresponding financial transaction. The Web Payment APIs are a set of specifications by the W3C Web Payments Working Group that aim to offer new and improved checkout and payment mechanisms for the Web. As these specifications strive to become the new standard for Web payments, security is a crucial aspect.

We have performed a rigorous, systematic formal analysis of the proposed Web Payment standards, based on the Web Infrastructure Model (WIM). The WIM is the most comprehensive formal model of the Web to date and serves as a framework to precisely analyze Web protocols, standards, and applications. During our analysis of the Web Payment specification, we discovered several vulnerabilities, one of which even allowed a malicious merchant to charge a customer multiple times for the same transaction. This security flaw is caused by an imprecision in the specification, which we discovered when trying to prove security. We verified that the attack works in practice and notified the W3C Web Payments Working Group as well as the Chromium developers of our findings. We also proposed fixes for the problem, which have since been adopted. We have incorporated these fixes into our formal model and have formally proved that the fixed standard is secure.
Tim Würtele
Room: Plenary


Considerations on Holder-of-Key-Bound Tokens from a FAPI Implementer's Perspective
We have been implementing the FAPI Read-Only API Security Profile and the FAPI Read and Write API Security Profile in open-source software. During this work, we found some points worth considering to make the FAPI security profiles more beneficial. In this talk, we will pick up one of these points: the "sender-constrained token".

One method of realizing sender-constrained tokens is Holder-of-Key binding, and the FAPI Read and Write API Security Profile states that the authorization code, access token, and refresh token be Holder-of-Key bound. To make these tokens Holder-of-Key bound, the profile accepts either OAuth 2.0 Token Binding [OAUTB] or OAuth 2.0 Mutual-TLS Client Authentication and Certificate-Bound Access Tokens [MTLS].

First, we examine the Holder-of-Key-bound authorization code under [OAUTB] and [MTLS]. We discuss the applicability of each method and clarify whose key can be bound to the authorization code in each case, by considering the nature of the technology each method adopts and the nature of the authorization code itself. We also discuss whose key should be bound to the authorization code in the context of the Authorization Code Grant, and how to realize a Holder-of-Key-bound token that is not sent directly from the token issuer (namely, the authorization server) to the token receiver (namely, the client).

Next, we examine Holder-of-Key-bound access tokens and refresh tokens under [OAUTB] and [MTLS]. We discuss the applicability of each method, describe some issues we found when implementing Holder-of-Key-bound access and refresh tokens in OSS with [MTLS], and discuss how to resolve them.

Finally, we discuss another method of realizing sender-constrained tokens, the "Client Bound Token", which gets around the issues that arise when realizing sender constraint via Holder-of-Key-bound authorization codes, access tokens, and refresh tokens.
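
For the [MTLS] variant, the binding check at the resource server is simple to state: the token carries a confirmation claim (`cnf` with `x5t#S256`, per RFC 8705) holding the base64url-encoded SHA-256 thumbprint of the client certificate, and the resource server compares it against the certificate presented on the mutual-TLS connection. A minimal sketch, with placeholder certificate bytes:

```python
# Sketch of the certificate-binding check from RFC 8705 (MTLS): the access
# token is Holder-of-Key bound by embedding the SHA-256 thumbprint of the
# client's TLS certificate in the token's cnf claim. The resource server
# recomputes the thumbprint from the certificate on the TLS connection.

import base64
import hashlib

def cert_thumbprint(cert_der: bytes) -> str:
    """Base64url-encoded (unpadded) SHA-256 thumbprint of a DER certificate."""
    digest = hashlib.sha256(cert_der).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode()

def is_token_bound_to_cert(token_claims: dict, cert_der: bytes) -> bool:
    """True iff the token's cnf/"x5t#S256" claim matches the presented cert."""
    expected = token_claims.get("cnf", {}).get("x5t#S256")
    return expected is not None and expected == cert_thumbprint(cert_der)
```

Note that this check only works for tokens presented directly over the mutual-TLS connection, which is exactly why the authorization code, delivered via the front channel, needs the separate treatment the talk discusses.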
Takashi Norimatsu
Room: Plenary


Formal Analysis of Mobile Multi-Factor Authentication with Single Sign-On Login
Over the last few years, there has been an almost exponential increase in the number of mobile applications that deal with sensitive data, such as applications for e-commerce or health. When dealing with sensitive data, classical authentication solutions based on username-password pairs are not enough; multi-factor authentication solutions that combine two or more authentication factors of different categories are required instead. Even though several solutions are currently in use, their security analyses have been performed informally, or semi-formally at best, without a reference model and without a precise definition of the multi-factor authentication property. This makes comparing the different solutions both complex and potentially misleading.

In this talk, we first present the design of two reference models for native applications based on the requirements of two real-world use-case scenarios. Features common to both are the use of one-time-password approaches and support for a single sign-on experience. We then provide a formal specification of our threat model and security goals, and discuss the automated security analysis that we performed. Our formal analysis validates the security goals of the two reference models we propose and provides an important building block for the formal analysis of other multi-factor authentication solutions.
Giada Sciarretta, Roberto Carbone
Room: Plenary
OAuth 2.0 meets Verifiable Credentials and Ethereum-based tokens
In this talk we will discuss the integration of W3C's Verifiable Credentials (VCs) and blockchain-based tokens into the OAuth 2.0 workflow. Both technologies encourage decentralization, facilitate self-sovereignty, and enable novel services. We will present a solution, developed in the context of the H2020 project SOFIE, that implements an OAuth 2.0 authorization server which uses VCs as the "authorization grant" and supports the generation of JSON Web Tokens (JWTs), complemented by blockchain-based ones.

The use of VCs as an authorization grant offers some intriguing advantages. For instance, VCs facilitate access-control enforcement, since they encode the attributes of a "prover" in a machine-readable and cryptographically verifiable format and allow "verifiers" to be pre-configured with "proof requests" that can be easily evaluated without any knowledge of the underlying semantics. Furthermore, VCs can serve as privacy-preserving mechanisms, and they facilitate interoperability.

Similarly, blockchain-based tokens enable auditability and accountability, and they allow the modification of a JWT even after it has been issued (for example, it can be revoked). This is achieved by recording auxiliary information in the blockchain, which is accessed when a JWT is validated by the resource server. With this approach, the smart contract becomes an asynchronous communication channel between the authorization server and the resource server; clients do not have to be aware of this channel, so many of the proposed advantages are achieved even if clients are oblivious to the existence of the blockchain. Furthermore, blockchain-based tokens enable novel services, such as fair exchange and token delegation.
We will discuss implementation and deployment issues, as well as performance measurements, based on our experience implementing the proposed approach using Hyperledger Indy, an open-source solution for generating and consuming VCs supported by the Linux Foundation, and Ethereum's ERC-721, a specification for non-fungible tokens that is already supported by many Ethereum "wallets".
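
Using a VC as the authorization grant amounts to presenting a verifiable presentation at the token endpoint in place of, say, an authorization code. The sketch below shows what such a request could look like; the grant-type URN and parameter names are hypothetical illustrations (RFC 6749 allows extension grant types identified by absolute URIs), and the SOFIE implementation's actual wire format may differ.

```python
# Illustrative token request using a Verifiable Credential presentation as
# an extension authorization grant. The grant-type URN and the field name
# "presentation" are hypothetical, chosen only to show the general shape.

import json

def build_vc_grant_request(verifiable_presentation: dict, scope: str) -> dict:
    """Form parameters presenting a VC-based proof as the authorization grant."""
    return {
        # Hypothetical extension grant type (RFC 6749 section 4.5 style):
        "grant_type": "urn:example:params:oauth:grant-type:verifiable-presentation",
        # The prover's cryptographically verifiable presentation, serialized:
        "presentation": json.dumps(verifiable_presentation),
        "scope": scope,
    }

request = build_vc_grant_request(
    {"type": "VerifiablePresentation", "verifiableCredential": []},
    "read")
# The AS would verify the presentation against its pre-configured proof
# request, then issue a JWT (here, complemented by a blockchain-based token).
```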
Nikos Fotiou
Room: ROOM 2