Securing Client Credentials Flow with Certificate

Sometimes an application requires background jobs to run without user interaction (though in most cases you can manage to avoid using App Permissions). In these particular exceptions, using app permissions with the client credentials flow can be justified, especially if you don't want to maintain (and secure) long-term storage of a user refresh token for non-user-context operations.


Prevalent use of Shared Secrets in Client Credentials

I often see Client Credentials used with a shared secret, and I do understand that for some SaaS integrations with self-service onboarding the shared secret is the easy way to get partners on board. (Although uploading a certificate for the particular app is not rocket science either, and could easily be used in most cases.)

Why Certificate is better than Shared Secret

Certs and Secrets show in Azure AD Application
  • Certificate credentials never transmit a plain-text secret when requesting access tokens from Azure AD. Instead, the client transmits a JWT assertion signed with the private key that the app holds. Verification is asymmetric: Azure AD holds only the public key, which can assert that the JWT came from the party in possession of the private key
  • A shared secret is in essence a weaker verification method (a string vs. a certificate)
  • There are more established ways to protect a certificate than a single string

Alternatives to a certificate

  • You can use API Management in conjunction with the JWT-bearer flow to gain better control over the shared secrets of 3rd parties.
    • This could be done by restricting caller IPs and applying several policies concerning partners' use of the client credentials they've been given. In this scenario API Management forwards the token once the policies are checked
  • You could put a short-lived (a few days) client secret in a Key Vault, and authorize Key Vault to answer only to certain trusted IPs… Still… once the plain text is exposed at code run time, the client secret is outside Key Vault's domain until it expires
    • Generally, client secrets aren't meant to be short-lived
  • … And if the API doesn't need App Permissions, you can use Conditional Access… https://docs.microsoft.com/en-us/azure/active-directory/conditional-access/overview
  • You shouldn't use a real user account as a service account just to get Conditional Access. Using a user principal instead of an actual service principal opens another set of problems, which this blog is too short to delve into

Enable NodeJS API for Client Credentials with Certificate

Before proceeding to use App Permissions, or a shared secret in any other flow, check that your scenario is not one of the examples below:

  1. App Permissions used in flows that could've used delegated permissions in the user's token context
  2. A mobile client using a client secret as part of the Authorization Code flow (a mobile client is not a confidential client; let's just leave it there)
  3. (Not directly related, but a concern) App Permissions required by a multi-tenant app
    • In most scenarios you can create an additional single-tenant app besides the registered multi-tenant one, to retain control of the shared secret (revocation etc.)
      • Granting an external multi-tenant app permissions is something you should think hard about before proceeding in the first place

Create Certificate Credentials for existing App Registration

Pre-hardening

  • Ensure the application doesn't have any redirect URIs. This effectively ensures no user sign-in process can get tokens returned for the application
  • Remove the default delegated user permissions from the app
  • Ensure Implicit Grant isn't enabled in the application (this wouldn't work anyway with the user permissions to sign in and read the user profile removed, but we do some additional cleaning here)
  • Remove any password credentials the app might have (obviously, if they are used in production, don't remove them until the flow is updated to use a certificate in code for these apps)

Pre-reqs

  • OpenSSL binaries
  • Azure AD PowerShell Module
$Subject = "CN=" + "DemoApplication2"
$Expiration = (get-date).AddYears(2)
$pass = Read-Host -Prompt "PFX exporting Password"
$cert = New-SelfSignedCertificate -CertStoreLocation "Cert:\CurrentUser\My" -Subject $Subject -KeySpec KeyExchange -NotAfter $Expiration
$AADKeyValue = [System.Convert]::ToBase64String($cert.GetRawCertData())
$cert| Export-PfxCertificate -FilePath (($Subject -split "=")[1] + ".pfx") -Password ($pass | ConvertTo-SecureString -AsPlainText -Force) 
#Navigate to directory where OpenSSL is installed
.\openssl.exe pkcs12 -in (($Subject -split "=")[1] + ".pfx") -passin "pass:$pass"  -out (($Subject -split "=")[1] + ".pem") -nodes
$data = get-content (($Subject -split "=")[1] + ".pem")
$data[$data.IndexOf("-----BEGIN PRIVATE KEY-----")..$data.IndexOf("-----END PRIVATE KEY-----")] | Out-File (($Subject -split "=")[1] + ".pem") -encoding "DEFAULT"
connect-azuread
$application = New-AzureADApplication -DisplayName ($Subject -split "=")[1]
New-AzureADApplicationKeyCredential -ObjectId $application.ObjectId -CustomKeyIdentifier ($Subject -split "=")[1] -Type AsymmetricX509Cert -Usage Verify -Value $AADKeyValue -EndDate ($Expiration | get-date -format "dd.M.yyyy" )

NodeJS Code Example using ADAL and Certificate Credentials

  • In the code, fill in the thumbprint, clientID (appId), and your tenant name
//Gets Access Token by sending private key signed JWT Token to Azure AD Token Endpoint 
//https://docs.microsoft.com/en-us/azure/active-directory/develop/active-directory-certificate-credentials
//https://www.npmjs.com/package/adal-node
const {AuthenticationContext,Logging} = require("adal-node")
//set logging level to verbose, and print to console 
Logging.setLoggingOptions({
    log: (level, message, error) => {
      console.log(message)
    },
    level: Logging.LOGGING_LEVEL.VERBOSE,
    loggingWithPII: true
  });
//Token Request Options
var options = {
    URLwithTenant: "https://login.windows.net/dewired.onmicrosoft.com",
    resource: "https://graph.microsoft.com",
    applicationId: "c81ea829-d488-46c2-8838-68dde9052478",
    certThumbPrint: "72B807BE590A73E0B88947C902C5C58E22344C5F"
}
//Construct the Authentication Context
var context = new AuthenticationContext(options.URLwithTenant);
//Read certificate from buffer to UTF-8 string
const fs = require("fs")
var keyStringFromBuffer = fs.readFileSync("DemoApplication2.pem").toString("UTF8")
console.log(keyStringFromBuffer)
//acquireToken
context.acquireTokenWithClientCertificate(options.resource, options.applicationId, keyStringFromBuffer, options.certThumbPrint, (error, tokenresponse) => {
    console.log(error, tokenresponse)
})

In the end you should see a verbose message with the token returned in the callback's tokenResponse.

If you see an error about the self-signed cert, ensure that the file encoding is UTF-8 and that there are no empty space characters in the PEM file. If you still see the error, copy the private key manually from the OpenSSL-created file.

Br, Joosua

Azure AD App Proxy|Forward incoming JWT token to backend service: What are my choices?

There is currently a feature request on feedback.azure.com for getting JWT tokens forwarded to the back-end service.

https://feedback.azure.com/forums/169401-azure-active-directory/suggestions/32386468-forward-incoming-jwt-token-to-backend-service

There are at least two scenarios for such a request. I am taking some shortcuts here, and assuming that in most scenarios this is an access token, similar to the one that native clients send in the Authorization: Bearer … header

  • Browser clients using XHR / Fetch
    • This seems to work ”out of the box”, as the browser session is ”authenticated” with session data held in the session-persisting cookies
    • I am using an example where the back-end service supplies the client, via client-side chained fetch() requests, with an access token. This token is then sent back to the back end and displayed there. This proves that Azure AD Application Proxy doesn't strip the bearer token from the Authorization header
function getToken() {
    fetch('/refreshToken').then((response) => {
        response.json().then((data) => {
            console.log(data['access_token'])
            var token = data['access_token']
            fetch('/caller', {
                headers: {
                    'Authorization': 'Bearer ' + token
                    // 'Content-Type': 'application/x-www-form-urlencoded',
                },
            }).then((response) => {
                response.json().then((data2) => {
                    console.log(data2)
                })
            })
        })
    })
}
The Authorization header is preserved, and can thus be received in the back end
  • Native clients outside of a web view sending the access token destined for App Proxy itself
    • I am excluding the scenario of a native client using a web view, where a browser is ”conjured” inside the app. In that case I'd assume the web view would behave similarly to the browser example (mostly?) and successfully send the token to the back end
    • This doesn't work (and per the explanation in the feature request, that's by design), but alternative approaches are available, which I've previously explored in another post
There is no Authorization header. I've added an extra header (Authorization2) for illustrative purposes

NodeJS Logging integration with Azure Log Analytics/Sentinel

If you want to send data from a NodeJS application to Log Analytics/Sentinel, you can do it by using the HTTP Data Collector API.

Sending data to a Sentinel-connected Log Analytics workspace as part of an incoming request callback

Note: if your app is an Azure PaaS solution, you should check out Application Insights first before going down this route 🙂

Writing a module for the Data Collector API

There were some existing examples of this, but I couldn't get them to work in a quick breeze. Due to this I did my own implementation, with some key differences:

The signature generation is done in two phases to improve readability

  • Basically, I separated the decoding of the base64 shared key into a buffer into a separate variable (var)

The function is a bit different, with callbacks and try/catch logic added

The request module handles the body payload non-stringified

I found that if I sent the body payload stringified, it wouldn't match the signature. To get the signature to match the body payload, I added the request option json:true and sent the non-stringified JSON payload.

The module to be imported

//https://nodejs.org/api/crypto.html
//https://docs.microsoft.com/en-us/azure/azure-monitor/platform/data-collector-api
//https://stackoverflow.com/questions/44532530/encoding-encrypting-the-azure-log-analytics-authorization-header-in-node-js
const rq = require('request')
const crypto = require('crypto')
const util = require('util')
function PushToAzureLogs (content, {id, key, rfc1123date, LogType}, callback) {
    console.log(id)
    try {
        //Check that the data can be parsed as JSON
        if (JSON.parse(JSON.stringify(content))) {
            var length = Buffer.byteLength(JSON.stringify(content), 'utf8')
            var binaryKey = Buffer.from(key, 'base64')
            var stringToSign = 'POST\n' + length + '\napplication/json\nx-ms-date:' + rfc1123date + '\n/api/logs'

            var hash = crypto.createHmac('sha256', binaryKey)
                .update(stringToSign, 'utf8')
                .digest('base64')
            var authorization = 'SharedKey ' + id + ':' + hash
            var options = {
                json: true,
                headers: {
                    'content-type': 'application/json',
                    'authorization': authorization,
                    'Log-Type': LogType,
                    'x-ms-date': rfc1123date,
                    'time-generated-field': 'DateValue'
                },
                body: content
            }
            var uri = 'https://' + id + '.ods.opinsights.azure.com/api/logs?api-version=2016-04-01'

            rq.post(uri, options, (err, Response) => {
                //Return the error inside the try/catch block
                if (err) {
                    return callback('No data sent to LA: ' + err)
                }
                callback('Data sent to LA ' + util.inspect(content) + ' with status code ' + Response.statusCode)
            })
        }
    //Catch the error if the data can't be parsed as JSON
    } catch (err) {
        callback('No data sent to LA: ' + err)
    }
}
module.exports={PushToAzureLogs}

Example from ExpressJS

//Add your other dependencies before this
const logs = require('./SRC/laws')
//define workspace details
const laws = {
    id: 'yourID',
    key: 'yourKey',
    rfc1123date: (new Date).toUTCString(),
    LogType: 'yourLogType'
}
app.get('/graph', (request, response) => {
    //not related to LA, this is the data I am sending to LA
    var token = mods.readToken('rt').access_token
    mods.apiCall(token, 'https://graph.microsoft.com/v1.0/me?$select=displayName,givenName,onPremisesSamAccountName', (data) => {
        console.log('reading graph', data)
        //LA object
        var jsonObject = {
            WAFCaller: request.hostname,
            identity: data.displayName,
            datasource: request.ip
        }
        console.log(jsonObject)
        //send data to LA
        logs.PushToAzureLogs(jsonObject, laws, (data) => {
            console.log(data)
        })
        //return original response
        response.send(data)
    })
})

Once the data is sent, it takes about 5-10 minutes for the first entries to pop up

If/when you attach the Log Analytics workspace to Sentinel, you can then use it to create your own hunting queries and combine your data with TI feeds etc.

Happy hunting!

Add sAMAccountName to Azure AD Access Token (JWT) with Claims Mapping Policy (and avoiding AADSTS50146)

With the possibilities available (and quite a few blogs on the subject), I can't blame anyone for wondering what's the right way to do this. At least I can present one way that worked for me.

Here are the ways to do it (option 1 obviously doesn't produce a JWT token):

  1. With SAML federations you have full claims selection in GUI
  2. Populate optional claims to the API in app registration manifest, given you’ve updated the schema for the particular app
  3. Create custom Claims Policy, to choose emitted claims (The option we’re exploring here)
  4. Query the directory extension claims from Microsoft Graph API appended in to the directory schema extension app* that Graph API can call

Please note: for sAMAccountName we're not using the approach where directory extensions are added to a Graph-API-queryable application = NO DIRECTORY EXTENSION SYNC IN AAD CONNECT NEEDED


Checklist for using Claims Mapping Policy

Pre: have the client application and web API ready before proceeding

#Example App to Add the Claims 
AzureADPreview\Connect-AzureAD
$Definition = [ordered]@{
    "ClaimsMappingPolicy" = [ordered]@{
        "Version" = 1
        "IncludeBasicClaimSet" = $true
        "ClaimsSchema" = @(
            [ordered]@{
                "Source" = "user"
                "ID" = "onpremisessamaccountname"
                "JwtClaimType" = "onpremisessamaccountname"
            }
        )
    }
}
$pol = New-AzureADPolicy -Definition ($Definition | ConvertTo-Json -Depth 3) -DisplayName ("Policy_" + ([System.Guid]::NewGuid().guid) + "_" + $Definition.ClaimsMappingPolicy.ClaimsSchema.JwtClaimType) -Type "ClaimsMappingPolicy"

$entApp = New-AzureADApplication -DisplayName ("DemoApp_" + $Definition.ClaimsMappingPolicy.ClaimsSchema.JwtClaimType)
$spnob = New-AzureADServicePrincipal -DisplayName $entApp.DisplayName -AppId $entApp.AppId
Add-AzureADServicePrincipalPolicy -Id $spnob.ObjectId -RefObjectId $pol.Id 
#From the GUI change the Identifier and acceptMappedClaims value (From the legacy experience)

  • Generally: the app that will emit the claims is not the one you use as the clientID (the client subscribing to the audience)
    • Essentially, you should create an un-trusted client with a clientID, and then add the audience/resource you're using under API permissions
  • Ensure that the SPN has an identifierUri that matches a registered custom domain in the tenant
    • The reasoning is vaguely explained here & here
      • Whatever research work the feedback senders did, it sure looked in-depth 🙂
  • Update the app manifest to accept mapped claims
    • Do this in the legacy experience; the new experience, at least in my tenant, didn't support updating this particular value
”Insufficient privileges to complete the operation”

If mapped claims are not accepted in the manifest, and the pre-requisites are not satisfied, you might get this error:

”AADSTS50146: This application is required to be configured with an application-specific signing key. It is either not configured with one, or the key has expired or is not yet valid. Please contact the application’s administrator.”

  • Below is an example of the manifest changes (acceptMappedClaims, and a verified-domain-matching identifier URI)
     "id": "901e4433-88a9-4f76-84ca-ddb4ceac8703",
    "acceptMappedClaims": true,
    "accessTokenAcceptedVersion": null,
    "addIns": [],
    "allowPublicClient": null,
    "appId": "9bcda514-7e6a-4702-9a0a-735dfdf248fd",
    "appRoles": [],
    "oauth2AllowUrlPathMatching": false,
    "createdDateTime": "2019-06-05T17:37:58Z",
    "groupMembershipClaims": null,
    "identifierUris": [
        "https://samajwt.dewi.red"
    ],

Testing

If you're planning to use a non-verified-domain-based identifier, you'll get:

”AADSTS501461: AcceptMappedClaims is only supported for a token audience matching the application GUID or an audience within the tenant’s verified domains.”
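To confirm the mapped claim actually shows up in a token you acquired, you can decode the payload locally. A minimal sketch (inspection only, no signature validation; the helper name is mine, the claim name matches the policy above):

```javascript
// Decode the middle (payload) segment of a JWT for inspection.
// Does NOT validate the signature; never use this for authorization decisions.
function decodeJwtPayload(token) {
  const payload = token.split('.')[1]
    .replace(/-/g, '+').replace(/_/g, '/') // base64url -> base64
  return JSON.parse(Buffer.from(payload, 'base64').toString('utf8'))
}

// Example: decodeJwtPayload(accessToken).onpremisessamaccountname
```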

References

https://github.com/MicrosoftDocs/azure-docs/issues/5394

https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/active-directory/develop/active-directory-claims-mapping.md

https://github.com/Azure-Samples/active-directory-dotnet-daemon-certificate-credential#create-a-self-signed-certificate

https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/active-directory/develop/v2-protocols-oidc.md#fetch-the-openid-connect-metadata-document

https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/active-directory/develop/active-directory-claims-mapping.md#example-create-and-assign-a-policy-to-include-the-employeeid-and-tenantcountry-as-claims-in-tokens-issued-to-a-service-principal

Decode JWT access and id tokens via PowerShell