Tuesday, September 21, 2021

Centralized Logging Proposal


 

Requirements: 

1. Low latency / low impact on client performance.

2. Must be scalable, able to handle a high number of requests in parallel.

3. Able to process log events quickly.

4. Log events must contain metadata that is useful (a sketch of such a record follows this list), meaning:

     a. What application is this from? 

     b. What client is this from? 

     c. What server did this come from? 

     d. What account/ user identity is involved? 

     e. What environment is this from? 

     f. Log level (e.g., Error, Log, Warning, Info, etc.)

     g. Meaningful record ID information (whether it be a MemberId, a CaseId, or some other unique identifier, if relevant)

5. Universally accessible and queryable interface (cross-platform accessibility, not dependent on language- or framework-specific libraries). This lends credibility to the API model.

6. Low maintenance / high availability (recommended that we keep this in its own environment)
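For reference, a minimal sketch of what a log event record carrying the metadata in requirement 4 might look like. The type and property names below are illustrative assumptions, not an agreed contract:

```csharp
using System;

// Hypothetical shape of a single log event; names are illustrative only.
public enum LogLevel { Info, Log, Warning, Error }

public record LogEvent
{
    public string Application { get; init; }    // 4a: originating application
    public string Client { get; init; }         // 4b: client the event belongs to
    public string Server { get; init; }         // 4c: server that produced the event
    public string UserIdentity { get; init; }   // 4d: account/user identity involved
    public string Environment { get; init; }    // 4e: e.g. Dev, QA, Prod
    public LogLevel Level { get; init; }        // 4f: log level
    public string RecordId { get; init; }       // 4g: MemberId, CaseId, or other identifier, if relevant
    public string Message { get; init; }
    public DateTimeOffset TimestampUtc { get; init; } = DateTimeOffset.UtcNow;
}
```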

 

Considerations: 

1. From a scalability standpoint, this logging system should be engineered so that an increase in logging events/requests does not impact performance. A message/background-job queue such as Hangfire should be used to decouple log writes from request handling and mitigate this risk (see the sketch after this list).

 

2. The logging database should exist in a single location: a single SQL Server instance with client-specific databases (one per client).

 

3. Log records should be archived after a certain time period. We may want to utilize another database for this purpose. 

 

4. Alerting. We may want the ability to send standardized email alerts when an error or event threshold is reached.
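Referring back to consideration 1, a minimal sketch of how log writes could be handed off to Hangfire so the client returns immediately and a background worker persists the record. The ILogEventWriter interface is hypothetical, and the LogEvent record comes from the earlier requirements sketch:

```csharp
using Hangfire;

// Hypothetical writer interface; the actual persistence logic (SQL Server,
// client-specific database routing, etc.) is out of scope for this sketch.
public interface ILogEventWriter
{
    void Write(LogEvent logEvent);
}

public class LoggingClient
{
    // Instead of writing to the logging database inline, enqueue the work.
    // Hangfire serializes the call and a background worker executes it,
    // keeping the impact on client performance low (requirement 1).
    public void Log(LogEvent logEvent)
    {
        BackgroundJob.Enqueue<ILogEventWriter>(writer => writer.Write(logEvent));
    }
}
```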

Authentication/Authorization

 


Authentication/Authorization Method Conclusion


To secure APIs and applications both internally and externally, we need to understand what our requirements are, how those requirements impact our existing ecosystem, and what role they will play in future architectural design decisions. Industry standards recognize three main Authentication/Authorization approaches: SAML, OAuth 2.0, and OpenID Connect (OIDC). These standards have been widely adopted and are constantly undergoing scrutiny and testing by major corporations and businesses. Each has specific benefits and some drawbacks, and each will be more beneficial and secure in certain environments.

At the most basic level, we have 5 different use case scenarios:

  1. External facing APIs

  2. External facing web applications that access internal/external APIs

  3. External facing mobile applications that access external APIs

  4. Internal facing APIs

  5. Internal facing web applications that access internal/external APIs


Currently we use a form of SAML authentication/authorization for Portal. Essentially, we receive an encrypted SAML token, decrypt it with a stored certificate, parse it, and save that information to a database. From what we understand about this implementation, it does not follow any standard flow; it relies on the assumption that the certificate we store is safe. SAML is known for cross-site scripting vulnerabilities and is rife with other security issues such as replay attacks (leading to DDoS exposure), man-in-the-middle attacks, and permission-escalation attacks. To secure our internal infrastructure, it is recommended that we avoid this type of authentication, or at least minimize its use.


Another option we have is OAuth 2.0. OAuth2 is, more or less, an authorization protocol that allows password-less, cross-application authorization. The protocol allows federated authorization across an organization or several different organizations. There are three main parties in this flow: the user, the consumer, and the service provider. The flow works like this:

  1. User requests a consumer to be able to do something on a 3rd party application or service.

  2. Consumer goes to the service provider and asks for permission.

  3. Service provider gives the consumer a token and a secret.

  4. Consumer sends the user over (with the token) to the service provider so they can approve.

  5. User gives the service provider the token and authorizes the consumer to have a certain set of permissions on the service provider on the user’s behalf.

  6. Consumer exchanges the request token for an access token and a secret.

  7. Consumer uses the access token to talk to the service provider and perform actions on behalf of the user.


Within OAuth2, there are also different grant flows intended for different scenarios.

  1. Machine-to-Machine Flow (Client Credentials). In this flow, the client is the resource owner, and no end-user authorization is needed. In this scenario, the client/resource owner holds the client ID and client secret and uses them to obtain access from the authorization server.

  2. Authorization Code Flow. This flow is used if the client is a regular web app executing on a server. The client can retrieve an access token and an optional refresh token. It's considered the safest choice since the Access Token is passed directly to the web server hosting the Client, without going through the user's web browser and risking exposure.


  3. Resource Owner Password Credentials Grant. In this flow, the end-user is asked to fill in credentials (username/password), typically using an interactive form. This information is sent to the backend and from there to the authorization server. It is therefore imperative that the Client is absolutely trusted with this information.


  4. Authorization Code Flow with Proof Key for Code Exchange (PKCE). The PKCE-enhanced Authorization Code Flow introduces a secret created by the calling application that can be verified by the authorization server; this secret is called the Code Verifier. Additionally, the calling app creates a transform value of the Code Verifier called the Code Challenge and sends this value over HTTPS to retrieve an Authorization Code. This way, a malicious attacker can only intercept the Authorization Code, and they cannot exchange it for a token without the Code Verifier.

  5. Implicit Flow with Form Post. The Implicit Flow with Form Post uses OIDC to implement web sign-in in a way that is very similar to how SAML and WS-Federation operate. The web app requests and obtains tokens through the front channel, without the need for secrets or extra backend calls. With this method, you don't need to obtain, maintain, use, and protect a secret in your application.


  6. Hybrid Flow (OIDC). Applications that can securely store Client Secrets may benefit from the Hybrid Flow (defined in section 3.3 of the OIDC spec), which allows your application to have immediate access to an ID token while still providing for secure and safe retrieval of access and refresh tokens. This can be useful in situations where your application needs immediate access to information about the user but must perform some processing before gaining access to protected resources for an extended period. This flow combines the standard implicit flow form post with the standard authorization code flow. Its steps are as follows:


  1. The browser redirects to the Authorize endpoint of the OAuth Server.

  2. If the user isn't authenticated, the OAuth Server redirects to the Authentication Service.

  3. The user authenticates and is redirected back to the OAuth Server.

  4. The OAuth Server redirects back to the client with the appropriate parameters in the response, based on the value of the response_type request parameter:

    • If code token was used, the server returns the Authorization Code and an Access Token.

    • If code id_token was used, the server returns the Authorization Code and an ID Token.

    • If code id_token token was used, the server returns the Authorization Code, an ID Token, and an Access Token.

  5. The Authorization Code is sent to the token endpoint, as in the Authorization Code flow (see the sketch after these steps).

  6. The backend tokens are returned: an Access Token and an ID Token. If the server settings permit, a Refresh Token is returned as well.
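To make steps 5 and 6 concrete, here is a rough sketch of exchanging the Authorization Code at the token endpoint over plain HTTP. The endpoint URL, client ID, and secret are placeholders, and in practice a certified OIDC client library would perform this exchange for us:

```csharp
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;

public static class TokenClient
{
    // Placeholder endpoint; in practice this comes from the provider's metadata.
    private const string TokenEndpoint = "https://auth.example.com/oauth2/token";

    public static async Task<string> ExchangeCodeAsync(
        HttpClient http, string authorizationCode, string redirectUri,
        string clientId, string clientSecret)
    {
        var form = new FormUrlEncodedContent(new Dictionary<string, string>
        {
            ["grant_type"]    = "authorization_code",
            ["code"]          = authorizationCode,
            ["redirect_uri"]  = redirectUri,
            ["client_id"]     = clientId,
            ["client_secret"] = clientSecret,
        });

        // The JSON response contains the Access Token, the ID Token, and
        // (if the server settings permit) a Refresh Token, as in step 6.
        HttpResponseMessage response = await http.PostAsync(TokenEndpoint, form);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }
}
```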


OpenID Connect is a thin layer built on top of OAuth2 that enables authentication. OIDC provides more information about the user. OIDC requires the use of a JWT called an ID token, which is meant to standardize areas that OAuth2 leaves open. OIDC is specifically focused on user authentication and is widely used to enable user logins on consumer sites and mobile apps. With the ID token, OIDC adds structure and predictability that allows otherwise different systems to interoperate and share authentication state and user profile information.

In summary, OIDC standardizes items that OAuth2 leaves open. Scopes and claims are standardized in OIDC; the standard scopes are openid, profile, email, and address.

The identity information in the ID token is specifically intended to be read by 3rd party applications to authenticate the same identity across multiple web applications, a crucial component of federation.

In addition to the ID token, the implementation of OpenID Connect comes with standardized endpoints. In particular, the /userinfo endpoint allows for the verification of identity information metadata and is key to interoperability with other OpenID Connect systems, making it suitable for enterprise-grade solutions.

JWT (pronounced "jot") is a cryptographically signed JSON payload that stores the user information. Using JWTs allows information to be verified and trusted via a digital signature; with this trusted digital signature in place, the information can later be verified using a signing key. OpenID Connect uses the JWT standard for the ID token.
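To illustrate that verification step, here is a minimal sketch of validating an ID token's signature and standard claims with the System.IdentityModel.Tokens.Jwt package. The issuer, audience, and signing key are placeholder assumptions; in a real setup the key would come from the provider's published metadata:

```csharp
using Microsoft.IdentityModel.Tokens;
using System.IdentityModel.Tokens.Jwt;
using System.Security.Claims;

public static class IdTokenValidator
{
    // Validates the ID token's signature, issuer, audience, and lifetime.
    // Issuer and audience values below are placeholders for illustration.
    public static ClaimsPrincipal Validate(string idToken, SecurityKey signingKey)
    {
        var parameters = new TokenValidationParameters
        {
            ValidIssuer = "https://auth.example.com",
            ValidAudience = "our-client-id",
            IssuerSigningKey = signingKey,
            ValidateLifetime = true,
        };

        var handler = new JwtSecurityTokenHandler();
        return handler.ValidateToken(idToken, parameters, out _);
    }
}
```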


Given our current requirements for Authentication/Authorization, we should utilize the OIDC Authorization Code Flow with PKCE for maximum security and industry-standard Authentication/Authorization best practices. This will enable us to provide controlled, federated access across all of our APIs and applications company-wide. With this type of flow in place, we will be able to globally control users' access to applications, their roles, and which groups they belong to, for both internal and external applications/APIs.
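For reference, generating the PKCE Code Verifier and Code Challenge described earlier is straightforward; a minimal sketch follows. In practice the client-side OIDC libraries handle this for us:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

public static class Pkce
{
    // Code Verifier: a high-entropy random string, base64url-encoded.
    public static string CreateCodeVerifier()
    {
        var bytes = new byte[32];
        using (var rng = RandomNumberGenerator.Create())
        {
            rng.GetBytes(bytes);
        }
        return Base64UrlEncode(bytes);
    }

    // Code Challenge: the SHA-256 hash of the verifier, base64url-encoded.
    // The challenge is sent with the authorization request; the verifier is
    // only revealed later, at the token endpoint.
    public static string CreateCodeChallenge(string codeVerifier)
    {
        using (var sha = SHA256.Create())
        {
            byte[] hash = sha.ComputeHash(Encoding.ASCII.GetBytes(codeVerifier));
            return Base64UrlEncode(hash);
        }
    }

    private static string Base64UrlEncode(byte[] bytes) =>
        Convert.ToBase64String(bytes).TrimEnd('=').Replace('+', '-').Replace('/', '_');
}
```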

How we drive this setup will be up to us. We do not want to reinvent the wheel by building our own tokens, authentication server, ID provider, etc. That would be disastrous in terms of time and resources, and we would not have enough knowledge to cover all the bases in terms of security. Instead, it is recommended that we go with a certified OIDC provider (server and services) that bundles an OAuth2 implementation.

Understanding that we do have certain instances where we receive a SAML token and use a 3rd party library to parse it, we could easily translate that token into a JWT in our system. This is an overall approach that would cover everything in our ecosystem: we would have one standard, one way that authentication occurs and authorization is granted, rather than the disjointed, security-hole-prone systems we have now. SAML is becoming obsolete and deprecated as more companies realize the problems and vulnerabilities associated with the technology. OIDC is the path we need to adopt going forward. We have several options for OIDC providers (both on-prem and off-prem). Angular and C# both offer generic packages that facilitate hookups to the OIDC pipeline given their respective platform-specific events and lifecycles (an example of the C# hookup is sketched below). This allows decoupling and prevents dependence on a specific provider. That said, it will not be possible to encapsulate all of OIDC into a simple API on our side; that implementation doesn't make sense, because you cannot decouple the OIDC flow from the language/platform-specific lifecycles. That is the job of the generic packages used by C# and Angular.
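As an example of what the C#-side hookup can look like, here is a rough sketch using ASP.NET Core's built-in OpenID Connect handler. The authority, client ID, and scopes are placeholders, and the exact configuration will depend on the provider we select:

```csharp
using Microsoft.AspNetCore.Authentication.Cookies;
using Microsoft.AspNetCore.Authentication.OpenIdConnect;
using Microsoft.Extensions.DependencyInjection;

public static class AuthSetup
{
    // Registers cookie + OIDC authentication; all values are placeholders.
    public static void ConfigureOidc(IServiceCollection services)
    {
        services.AddAuthentication(options =>
            {
                options.DefaultScheme = CookieAuthenticationDefaults.AuthenticationScheme;
                options.DefaultChallengeScheme = OpenIdConnectDefaults.AuthenticationScheme;
            })
            .AddCookie()
            .AddOpenIdConnect(options =>
            {
                options.Authority = "https://auth.example.com";  // the OIDC provider we choose
                options.ClientId = "portal-client";               // placeholder client ID
                options.ResponseType = "code";                    // Authorization Code flow
                options.UsePkce = true;                           // PKCE, per the recommendation above
                options.Scope.Add("profile");
                options.Scope.Add("email");
                options.GetClaimsFromUserInfoEndpoint = true;     // pulls profile data from /userinfo
            });
    }
}
```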
