Securing Client Credentials Flow with Certificate

Sometimes an application requires background jobs to run without user interaction (though in most cases you can manage to avoid using App Permissions). In these particular exceptions, using App Permissions with the client credentials flow can be justified, especially if you don’t want to maintain (and secure) long-term storage of a user refresh token for non-user-context operations.


Prevalent use of Shared Secrets in Client Credentials

I often see Client Credentials used with a shared secret, and I do understand that for some SaaS integrations with self-service on-boarding, the shared secret is the easy way to get partners on board. (Although uploading a certificate (public key) for the particular app is not rocket science either, and could easily be used in most cases.)

Why Certificate is better than Shared Secret

Certs and Secrets shown in an Azure AD application
  • Certificate credentials never transmit a plain-text secret when requesting access tokens from Azure AD. Instead the app transmits a JWT signed with the private key that it holds. Verification is asymmetric: Azure AD holds only the public key, which can assert that the JWT came from the party in possession of the private key
  • A shared secret is in essence a weaker verification method (a string vs. a certificate)
  • There are more established ways to protect a certificate than a single string

Alternatives to a certificate

  • You can use API management in conjunction with JWT-Bearer flow to gain better control of Shared Secret of 3rd parties.
    • This could be done by restricting caller IPs and applying several policies governing the partners’ use of the Client Credentials. In this scenario API Management forwards the token once the policies are checked
  • You could put a short-lived (a few days) client secret in a Key Vault, and authorize Key Vault to answer only to certain trusted IPs… Still, once the plain text is exposed in the code at run time, the client secret is outside Key Vault’s domain until it expires
    • Generally, client secrets aren’t meant to be short-lived
  • … And if the API doesn’t need App Permissions, you can use Conditional Access: https://docs.microsoft.com/en-us/azure/active-directory/conditional-access/overview
  • You shouldn’t use a real user account as a service account just to get Conditional Access. Using a user principal instead of an actual service principal opens another set of problems, which this blog is too short to delve into

Enable NodeJS API for Client Credentials with Certificate

Before proceeding with App Permissions, or a shared secret in any other flow, check that your scenario is not one of the examples below:

  1. App Permissions used in flows that could’ve used delegated permissions using users token context
  2. Mobile client using a client secret as part of the Authorization Code flow (a mobile client is not a confidential client, let’s just leave it there)
  3. (Not directly related, but a concern) App Permissions are required by multi-tenant app
    • In most scenarios you can create an additional single-tenant app besides the registered multi-tenant one, to retain control of the credential (revocation etc.).
      • Giving external multi-tenant app permissions is something you should think hard before proceeding in the first place

Create Certificate Credentials for existing App Registration

Pre-hardening

  • Ensure the application doesn’t have any redirect URIs. This effectively ensures no user sign-in process can get tokens returned for the application
  • Remove the default delegated user permissions from the app
  • Ensure Implicit Grant isn’t enabled in the application (this wouldn’t work anyway with the sign-in and read-user-profile permission removed, but we do some additional cleaning here)
  • Remove any password credentials the app might have (obviously, if they are used in production, don’t remove them until the flow is updated to use the certificate in code)

Pre-reqs

  • OpenSSL binaries
  • Azure AD PowerShell Module
$Subject = "CN=" + "DemoApplication2"
$Expiration = (get-date).AddYears(2)
$pass = Read-Host -Prompt "PFX exporting Password"
$cert = New-SelfSignedCertificate -CertStoreLocation "Cert:\CurrentUser\My" -Subject $Subject -KeySpec KeyExchange -NotAfter $Expiration
$AADKeyValue = [System.Convert]::ToBase64String($cert.GetRawCertData())
$cert| Export-PfxCertificate -FilePath (($Subject -split "=")[1] + ".pfx") -Password ($pass | ConvertTo-SecureString -AsPlainText -Force) 
#Navigate to directory where OpenSSL is installed
.\openssl.exe pkcs12 -in (($Subject -split "=")[1] + ".pfx") -passin "pass:$pass"  -out (($Subject -split "=")[1] + ".pem") -nodes
$data = get-content (($Subject -split "=")[1] + ".pem")
$data[$data.IndexOf("-----BEGIN PRIVATE KEY-----")..$data.IndexOf("-----END PRIVATE KEY-----")] | Out-File (($Subject -split "=")[1] + ".pem") -encoding "DEFAULT"
Connect-AzureAD
$application = New-AzureADApplication -DisplayName ($Subject -split "=")[1]
New-AzureADApplicationKeyCredential -ObjectId $application.ObjectId -CustomKeyIdentifier ($Subject -split "=")[1] -Type AsymmetricX509Cert -Usage Verify -Value $AADKeyValue -EndDate ($Expiration | get-date -format "dd.M.yyyy" )

NodeJS Code Example using ADAL and Certificate Credentials

  • In the code, fill in the thumbprint, clientID (AppID), and your tenant name
//Gets Access Token by sending private key signed JWT Token to Azure AD Token Endpoint 
//https://docs.microsoft.com/en-us/azure/active-directory/develop/active-directory-certificate-credentials
//https://www.npmjs.com/package/adal-node
const {AuthenticationContext,Logging} = require("adal-node")
//set logging level to verbose, and print to console 
Logging.setLoggingOptions({
    log: (level, message, error) => {
      console.log(message)
    },
    level: Logging.LOGGING_LEVEL.VERBOSE,
    loggingWithPII: true
  });
//Token Request Options
  var options = {
    URLwithTenant:"https://login.windows.net/dewired.onmicrosoft.com",
    resource:"https://graph.microsoft.com", 
    applicationId : "c81ea829-d488-46c2-8838-68dde9052478",
    certThumbPrint:"72B807BE590A73E0B88947C902C5C58E22344C5F"
  }
//Construct the Authentication Context
var context = new AuthenticationContext(options.URLwithTenant);
//Read Certificate from buffer to UTF8 string 
const fs = require("fs")
var keyStringFromBuffer = fs.readFileSync("DemoApplication2.pem").toString("UTF8")
console.log(keyStringFromBuffer)
//acquireToken
context.acquireTokenWithClientCertificate(options.resource,options.applicationId,keyStringFromBuffer,options.certThumbPrint,(error,tokenresponse) => {
console.log(error,tokenresponse)
})

In the end you should see a verbose message, with the token returned in the callback as tokenResponse.

If you see an error about the self-signed cert, ensure that all localization settings match UTF-8, and that there are no empty space characters in the PEM file. If you still see the error, copy the private key manually from the OpenSSL-created file.
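For checking what actually came back, a small helper like the following (illustrative, not part of adal-node) decodes the JWT payload of the returned access token without verifying the signature, which is fine for debugging but never for authorization decisions:

```javascript
// Decode a JWT access token's payload for inspection only (no signature check).
// JWT segments are base64url, so map the URL-safe alphabet back to base64 first.
function decodeTokenPayload(accessToken) {
  const segment = accessToken.split('.')[1]
  const b64 = segment.replace(/-/g, '+').replace(/_/g, '/')
  return JSON.parse(Buffer.from(b64, 'base64').toString('utf8'))
}
```

For a client-credentials token against Microsoft Graph you would expect claims such as `aud` and `appid` reflecting the values configured above.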

Br, Joosua

Measuring Node Execution In VSCode with PowerShell

Background

Forcing synchronous execution in NodeJS sometimes increases a function’s execution time, because you want to ensure a certain order of progress in the code.

There are many good reads on the subject. TL;DR: in my opinion you can reduce the amount of code quite drastically in select cases by introducing synchronous behavior into your code.

Timing functions

Generally you can measure a function’s execution time with console.time() or by using dates for the calculation. There are also many wrapper solutions for much more fine-grained measurements, telemetry etc.

Since PowerShell is my default console for VSCode, I tend to use native PowerShell functions to speed things up – and this is where the Measure-Command cmdlet comes into play.
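One caveat: Measure-Command times the whole `node` process, including interpreter startup. If you want to time just a function from inside the script, a sketch like this (my own helper, not a library API) does the in-process equivalent using process.hrtime:

```javascript
// In-process alternative to Measure-Command: times only the function itself,
// excluding Node's startup cost that an external Measure-Command includes.
function timeIt(label, fn) {
  const start = process.hrtime.bigint()
  fn()
  const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6
  console.log(`${label}: ${elapsedMs.toFixed(2)} ms`)
  return elapsedMs
}
```

Usage: `timeIt('loop', () => LoopAsync(uris))` prints the label and elapsed milliseconds.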

Non blocking function

Blocking function

//Non Blocking function
const req = require('request-promise')
var uris =[
    "https://uri1.com",
    "https://uri2.com",
    "https://uri3.com",
    "https://uri4.com"
]
 function LoopAsync (uris) {
    for (let index = 0; index < uris.length; index++) {
        const element = uris[index];
        req(element).then((result) => {
            console.log(element + ".>" + result.headers.server)
        })
    }
}
LoopAsync(uris)

//Blocking function
const req = require('request-promise')
var uris =[
    "https://uri1.com",
    "https://uri2.com",
    "https://uri3.com",
    "https://uri4.com"
]
async function LoopAsyncAwait (uris) {
    for (let index = 0; index < uris.length; index++) {
        const element = uris[index];
        const data = await req(element)
        console.log(element + ".>" + data.headers.server)
    }
}
LoopAsyncAwait(uris)
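There is also a middle ground between the two loops above: fire all the requests concurrently but still await the full set, which keeps results in input order while taking roughly the time of the slowest single call. The sketch below simulates the calls with timers instead of request-promise so it runs standalone; swap `delay` for the real HTTP call in practice.

```javascript
// Middle ground: concurrent execution with ordered, awaited results.
// delay() stands in for an HTTP request so the example is self-contained.
const delay = (ms, value) => new Promise(resolve => setTimeout(() => resolve(value), ms))

async function loopAllAtOnce(uris) {
  // All promises start immediately; Promise.all preserves input order
  const results = await Promise.all(uris.map(uri => delay(10, uri + ' done')))
  return results
}
```

Compared to LoopAsyncAwait, total time is about one call instead of the sum of all calls, yet unlike LoopAsync you still get a single point where everything has completed.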

Example for VScode

Measure-Command {node .\AsyncLoop.js}
Measure-Command {node .\syncLoop.js}

NodeJS Logging integration with Azure Log Analytics/Sentinel

If you want to send data from NodeJS application to Log Analytics/Sentinel you can do it by using the HTTP Log Collector API.

Sending data to Sentinel Connected Log Analytics WorkSpace as part of incoming request callback

Note: If your app is in Azure PaaS solution, you should check out AppInsights first before going to this route 🙂

Writing module for the Log Collector API

There were some existing examples of how to do this, but I couldn’t get them to work quickly. Because of that I did my own implementation, with some key differences:

Signature generation part is done in two phases to improve readability

  • Basically, I separated the decoding of the base64 shared key into a buffer into its own variable (var)

The function is a bit different, with callbacks and try/catch logic added

The Request module will handle the body payload non-stringified

I found that if I sent the body payload stringified, it wouldn’t match the signature. To get the signature to match the body payload, I added the request option json:true and sent the non-stringified JSON payload.

The module to be imported

//https://nodejs.org/api/crypto.html
//https://docs.microsoft.com/en-us/azure/azure-monitor/platform/data-collector-api
//https://stackoverflow.com/questions/44532530/encoding-encrypting-the-azure-log-analytics-authorization-header-in-node-js
const rq = require('request')
const crypto = require('crypto')
const util = require('util')
function PushToAzureLogs (content,{id,key,rfc1123date,LogType}, callback) {
    console.log(id)
    try {
        //Checking if the data can be parsed as JSON
        if ( JSON.parse(JSON.stringify(content)) ) {
            var length = Buffer.byteLength(JSON.stringify(content),'utf8')
            var binaryKey = Buffer.from(key,'base64')
            var stringToSign = 'POST\n' + length + '\napplication/json\nx-ms-date:' + rfc1123date + '\n/api/logs';
            //console.log(stringToSign)
    
            var hash = crypto.createHmac('sha256',binaryKey)
            .update(stringToSign,'utf8')
            .digest('base64')
            var authorization = "SharedKey "+id +":"+hash
            var options= {
            json:true,
            headers:{
            "content-type": "application/json", 
            "authorization":authorization,
            "Log-Type":LogType,
            "x-ms-date":rfc1123date,
            "time-generated-field":"DateValue"
            },
            body:content    
            }
            var uri = "https://"+ id + ".ods.opinsights.azure.com/api/logs?api-version=2016-04-01"
    
            rq.post(uri,options,(err,Response) => {
                //return if error inside try catch block 
                if (err) {
                    return callback(("No data sent to LA: " + err))
                }
               callback(("Data sent to LA " + util.inspect(content) + " with status code " + Response.statusCode))
    
            })
    
        }
        //Catch error if data cant be parsed as JSON
    } catch (err) {
        callback(("No data sent to LA: " + err))
    }
           
}
module.exports={PushToAzureLogs}

Example from ExpressJS

//Add your other dependencies before this
const logs = require('./SRC/laws')
//define workspace details
const laws = {
    id:'yourID',
    key:'yourKey',
    rfc1123date:(new Date).toUTCString(),
    LogType:'yourLogType'
}
app.get('/graph', (request,response) => {
//not related to LA, this the data I am sending to LA
    var token = mods.readToken('rt').access_token
    mods.apiCall(token,'https://graph.microsoft.com/v1.0/me?$select=displayName,givenName,onPremisesSamAccountName', (data) => {
    console.log('reading graph', data)
//LA object
    const jsonObject = {
        WAFCaller:request.hostname,
        identity:data.displayName,
        datasource:request.ip
    }
    console.log(jsonObject)
//send data to LA
        logs.PushToAzureLogs(jsonObject,laws,(data)=> {
            console.log(data)
        })
//return original response
    response.send(data)
    })
})

Once the data is sent, it will take about 5–10 minutes for the first entries to pop up.

If/when you attach the Log Analytics workspace to Sentinel, you can then use it to create your own hunting queries, and combine the data you have with TI feeds etc.

Happy hunting!

Azure AD – Add Custom claims for WS-Federation applications

Disclaimer: The information in this weblog is provided “AS IS” with no warranties and confers no rights.

Most guides using Azure AD as IdP focus on OAuth and OAuth2 with OIDC flows for API access, and on SAML for enterprise SSO. There’s not much endorsement for WS-Federation, and that’s understandable because the two previous options cover pretty much every scenario you would ideally have. But what if you have an app that explicitly uses WS-Federation? For example, a Microsoft OWIN-based application?

There aren’t good support articles on using WS-Federation and custom claims when Azure AD is the IdP, but WS-Federation is definitely supported with Azure AD acting as IdP.

The only article I found covering custom claims and mentioning WS-Federation was the one I used to write my previous article on the preview of custom claims (how to get sAMAccountName into JWT tokens).

Since Azure AD apps are largely protocol-agnostic, I figured that the claims mappings might be equally protocol-agnostic, as long as you understand how custom claims and federations work.

Guide to testing

The order here is important. While the SAML endpoint is enabled by default in any Azure AD app, specific settings, especially the ones for configuring the Service Provider, aren’t directly available in the ”non-SAML” app you create – unless you play around with the App Registration manifest. There are also other features that require spawning the app initially as a SAML app (for example using a custom token signing cert, using token encryption and so on).

  • Configure new SAML federation at Azure AD
  • Add the custom claims for the federation
  • Set Single-Sign On mode to disabled
    • If you’re planning to use this in production, you have to get the metadata from the SAML federation settings; I will do some further investigation later to update this blog
  • For metadata replace the metadata URL with tenant and AppID details
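The per-app metadata URL mentioned above follows the documented pattern for Azure AD federation metadata. A trivial sketch (tenantId and appId are placeholders for your directory and App Registration):

```javascript
// Builds the per-application federation metadata URL for an Azure AD tenant.
// Pattern from the Azure AD federation metadata documentation; both
// parameters are placeholders you replace with your own values.
function federationMetadataUrl(tenantId, appId) {
  return `https://login.microsoftonline.com/${tenantId}` +
    `/federationmetadata/2007-06/federationmetadata.xml?appid=${appId}`
}
```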

Test the federation.

I have a handy NodeJS Express app I use for testing and debugging pretty much any authorization-related stuff, with the very simple purpose of catching different inbound calls.

About the code… the code itself is the epitome of laziness, so I am too embarrassed to share it :)…

  • Invoke the call
    • Use AppID or any Identifier you’ve defined in the manifest
    • If you have multiple reply URLs, define the ’wreply’ param
https://login.microsoftonline.com/common/wsfed?
 wtrealm=1de8c976-fe00-4008-91d5-e5a2381d40a6
 &wctx=WSfedState
 &wa=wsignin1.0
  • Confirm the emitted claims
    • In this particular case it’s user.displayname +(join) user.dnsdomain
Typical WS-Federation body
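The sign-in request shown above can be assembled safely with URLSearchParams instead of hand-concatenation; this is just a convenience sketch, with the wtrealm/wctx values as placeholders:

```javascript
// Builds the WS-Federation sign-in URL from the parameters shown above.
// wreply is optional and only needed when the app has multiple reply URLs.
function wsFedSignInUrl(wtrealm, wctx, wreply) {
  const params = new URLSearchParams({ wtrealm, wctx, wa: 'wsignin1.0' })
  if (wreply) params.set('wreply', wreply)
  return 'https://login.microsoftonline.com/common/wsfed?' + params.toString()
}
```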

There are many questions and details that I could add here, but since I am just fooling around with a feature, I am not devoting any more time until there is actual production need for such scenario.