Experimental – Using Azure Function Proxy as Authenticating Reverse Proxy for NodeJS Docker App

Disclaimer: Azure Function Proxies are meant to act as proxies for the functions themselves, and as aggregators of microservice-style resources/API’s in the function’s proximity. If you need an actual reverse proxy, or a full-blown API gateway, then solutions such as Azure API Management, Azure AD App Proxy, Azure App GW, Kemp VLM, or just placing NGINX on your container might be the right pick.

Now that the disclaimer is out of the way, I can continue experimenting with this excellent feature without having any kind of business justification.

My inspiration came from MS’s similar article, which covers using a function proxy route to publish a certain part of a WordPress site. My angle was to see if the same approach can be used with App Service Authentication.

Obvious caveats

  • This is not necessarily the way this feature was intended to be used 🙂
  • Cold start of any Functions-based solution (maybe do the same with an App Service web app instead)
  • If you are running a docker image, then why not run it in App Service in the first place?
    • If the app is something other than a docker image and prefers to live on a VM, then this approach might still be of interest

Obvious benefits

  • Deploy your reverse proxy or API gateway, and the rules of the solution, as code
    • Functions is certainly not the only solution to support this approach, but Functions integrates with VS Code and CI/CD solutions, so you end up having your solution entirely defined as re-deployable code
    • Setting reverse proxy rules as code, for example (see the proxies.json sketch after this list)
  • Alternative approach for a Single Page App / static website, where the function acts as a middle-end aggregator for certain tasks that are better handled outside of the browser due to possible security concerns
    • Don’t get me wrong here… I believe you can make perfectly secure SPA’s, and looking at JAMstack and the new Azure Static Web Apps offering, it seems that we are also heading that way 🙂
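As a rough illustration of the rules-as-code point, here is a minimal proxies.json sketch. The proxy name, wildcard route and back-end address/port are assumptions for illustration (the address matches the internal IP seen later in the tcpdump output), not a verbatim copy of the configuration used in these tests:

{
  "$schema": "http://json.schemastore.org/proxies",
  "proxies": {
    "dockerApp": {
      "matchCondition": {
        "methods": [ "GET", "POST" ],
        "route": "/{*restOfPath}"
      },
      "backendUri": "http://172.30.10.36:8080/{restOfPath}"
    }
  }
}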

Background

Test environment

  • Azure VM
    • running a NodeJS Express app docker image baked with VS Code’s insanely good Docker extension
    • In the same VNET as the App Service Plan
  • Function
    • In the same VNET as the Azure VM running the docker image

Test results

  • Sign in to the application works on fresh authentication
    • After fresh authentication the session is maintained by app service cookies
  • When there was an existing session on Azure AD, the authorization flow for this app resulted in HTTP error 431 (Request Header Fields Too Large).
    • If there was an actual use scenario I would debug this further, and possibly create another redirecting function to ingest the token, which would drop the proper cookie for the subsequent sign-in
  • I haven’t tested whether there are issues with more advanced content types; I would expect that the proxy function forwards the back-end response’s content type (maybe a test for another blog)
  • From the tcpdump trace running on the Docker VM you can see the internal IP of the App Service
    • 07:22:53.754245 IP 172.30.10.29.54044 > 172.30.10.36.8080: Flags [.], ack 218, win 221, options [nop,nop,TS val 104639808 ecr 1486010770], length 0

Ideas for next blog?

Some delicious continuation tests for this approach could be:

  • Based on the internal headers created by the EasyAuth module:
    • Create a PoC for Native and Single Page Apps using the Authorization header
    • Create a test scenario for using internal B2C authentication (I have an app ready for this)
    • Add internal proxy routes to perform further authorization rules
    • Forward authentication tokens or username headers to the docker back-end application, by defining the proxy application as an external redirect target, or by using the internal function methods (a rough header-reading sketch follows below)
https://docs.microsoft.com/en-us/azure/app-service/overview-authentication-authorization
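As a rough sketch of that header-reading idea: the header names below come from the App Service Authentication documentation, and whether they survive the proxy hop to the docker back end is exactly what would need testing. The Express app itself is purely illustrative, not the app used in these tests:

const express = require('express')
const app = express()

// Log the EasyAuth headers (if any) that made it through the proxy to the back end
app.use((request, response, next) => {
    console.log('principal:', request.headers['x-ms-client-principal-name'])
    console.log('aad access token present:', Boolean(request.headers['x-ms-token-aad-access-token']))
    next()
})

app.get('/', (request, response) => {
    response.send('hello from the docker back end')
})

app.listen(8080)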

Till next time

Joosua

Azure AD App Proxy|Forward incoming JWT token to backend service: What are my choices?

Currently there is a feature request on feedback.azure.com for getting JWT tokens forwarded to the back end.

https://feedback.azure.com/forums/169401-azure-active-directory/suggestions/32386468-forward-incoming-jwt-token-to-backend-service

There are at least two scenarios for such a request. I am taking some shortcuts here, and assuming that in most scenarios this is an access token, similar to the one that is issued and sent by native clients in the Authorization: Bearer … header

  • Browser clients using XHR / Fetch
    • This seems to work ”out-of-the-box”, as the browser session is ”authenticated” with session data held in the session-persisting cookies
    • I am using an example where the back-end service supplies the client, via client-side chained fetch() requests, with an access token. This token is then sent back to the back end and displayed there. This is to prove that Azure AD Application Proxy doesn’t strip the bearer token from the Authorization header
function getToken () {
    fetch('/refreshToken').then((response) => {
        response.json().then((data) => {
            console.log(data['access_token'])
            var token = data['access_token']
            fetch('/caller', {
                headers: {
                    'Authorization': 'Bearer ' + token
                    // 'Content-Type': 'application/x-www-form-urlencoded',
                },
            }).then((response) => {
                response.json().then((data2) => {
                    console.log(data2)
                })
            })
        })
    })
}
The Authorization header is retained, and can thus be received in the back end (see the back-end sketch at the end of this section)
  • Native clients outside of web view sending the Access Token destined for AppProxy itself
    • I am excluding the scenario of a native client using a web view, where a browser is ”conjured” inside the app. In that case I’d assume the web view would behave similarly to the browser example (mostly?) and successfully send the token to the back end
    • This doesn’t work (and per the explanation in the feature request, that’s by design), but alternative ways are available, which I’ve previously explored in another post
There is no Authorization header. I’ve added an extra header (Authorization2) for illustrative purposes
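For completeness, a minimal sketch of what the back end behind App Proxy could look like here. The /caller route name matches the fetch() example above, but the implementation is illustrative, not the exact app used in these tests. It simply echoes back which authorization-related headers actually arrived, which shows the difference between the browser case (Authorization present) and the native-client case (Authorization stripped by design):

const express = require('express')
const app = express()

// Echo back which authorization-related headers reached the back end
app.get('/caller', (request, response) => {
    response.json({
        authorization: request.headers['authorization'] || 'not present',
        authorization2: request.headers['authorization2'] || 'not present'
    })
})

app.listen(8080)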

NodeJS Logging integration with Azure Log Analytics/Sentinel

If you want to send data from a NodeJS application to Log Analytics/Sentinel, you can do it by using the HTTP Data Collector API.

Sending data to a Sentinel-connected Log Analytics workspace as part of an incoming request callback

Note: If your app is an Azure PaaS solution, you should check out App Insights first before going down this route 🙂

Writing a module for the Data Collector API

There were some existing examples for doing this, but I couldn’t get them to work quickly. Because of that I did my own implementation, with some key differences:

The signature generation part is done in two phases to improve readability

  • Basically I separated the creation of the base64-decoded shared-key buffer into a separate variable (var)

The function is a bit different, with callbacks and try/catch logic added

The request module handles the body payload non-stringified

I found that if I sent the body payload stringified, it wouldn’t match the signature. To get the signature to match the body payload, I added the request option json:true and sent the non-stringified JSON payload.

The module to be imported

//https://nodejs.org/api/crypto.html
//https://docs.microsoft.com/en-us/azure/azure-monitor/platform/data-collector-api
//https://stackoverflow.com/questions/44532530/encoding-encrypting-the-azure-log-analytics-authorization-header-in-node-js
const rq = require('request')
const crypto = require('crypto')
const util = require('util')

function PushToAzureLogs (content, {id, key, rfc1123date, LogType}, callback) {
    console.log(id)
    try {
        //Checking if the data can be parsed as JSON
        if (JSON.parse(JSON.stringify(content))) {
            var length = Buffer.byteLength(JSON.stringify(content), 'utf8')
            //Shared key decoded from base64 into a separate buffer variable
            var binaryKey = Buffer.from(key, 'base64')
            var stringToSign = 'POST\n' + length + '\napplication/json\nx-ms-date:' + rfc1123date + '\n/api/logs';
            //console.log(stringToSign)

            //HMAC-SHA256 signature of the string-to-sign, base64 encoded
            var hash = crypto.createHmac('sha256', binaryKey)
                .update(stringToSign, 'utf8')
                .digest('base64')
            var authorization = "SharedKey " + id + ":" + hash
            var options = {
                //json:true makes request send the non-stringified body, so it matches the signature
                json: true,
                headers: {
                    "content-type": "application/json",
                    "authorization": authorization,
                    "Log-Type": LogType,
                    "x-ms-date": rfc1123date,
                    "time-generated-field": "DateValue"
                },
                body: content
            }
            var uri = "https://" + id + ".ods.opinsights.azure.com/api/logs?api-version=2016-04-01"

            rq.post(uri, options, (err, Response) => {
                //return if error inside try catch block
                if (err) {
                    return callback(("No data sent to LA: " + err))
                }
                callback(("Data sent to LA " + util.inspect(content) + " with status code " + Response.statusCode))
            })
        }
    //Catch error if data can't be parsed as JSON
    } catch (err) {
        callback(("No data sent to LA: " + err))
    }
}
module.exports = {PushToAzureLogs}

Example from ExpressJS

//Add your other dependencies before this
const logs = require('./SRC/laws')
//define workspace details
const laws = {
    id: 'yourID',
    key: 'yourKey',
    rfc1123date: (new Date).toUTCString(),
    LogType: 'yourLogType'
}

app.get('/graph', (request, response) => {
    //not related to LA as such: this is just how the data to be sent to LA is obtained
    var token = mods.readToken('rt').access_token
    mods.apiCall(token, 'https://graph.microsoft.com/v1.0/me?$select=displayName,givenName,onPremisesSamAccountName', (data) => {
        console.log('reading graph', data)
        //refresh the x-ms-date so the signature is valid for this request
        laws.rfc1123date = (new Date).toUTCString()
        //LA object
        var jsonObject = {
            WAFCaller: request.hostname,
            identity: data.displayName,
            datasource: request.ip
        }
        console.log(jsonObject)
        //send data to LA
        logs.PushToAzureLogs(jsonObject, laws, (data) => {
            console.log(data)
        })
        //return original response
        response.send(data)
    })
})

Once the data is sent, it takes about 5-10 minutes for the first entries to pop up

If/when you attach the Log Analytics workspace to Sentinel, you can then use it to create your own hunting queries, and combine your data with TI feeds etc. (a rough example query follows below)
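As a rough sketch of such a query, assuming the default custom-log naming (the log type gets a _CL suffix and string fields a _s suffix, so the exact names depend on your LogType and payload):

yourLogType_CL
| where TimeGenerated > ago(1h)
| project TimeGenerated, WAFCaller_s, identity_s, datasource_s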

Happy hunting!

Add sAMAccountName to Azure AD Access Token (JWT) with Claims Mapping Policy (and avoiding AADSTS50146)

With the possibilities available (and quite many blogs on the subject), I can’t blame anyone for wondering what’s the right way to do this. At least I can present one way that worked for me.

Here are the ways to do it (option 1 obviously doesn’t apply to the JWT token):

  1. With SAML federations you have full claims selection in GUI
  2. Populate optional claims to the API in app registration manifest, given you’ve updated the schema for the particular app
  3. Create custom Claims Policy, to choose emitted claims (The option we’re exploring here)
  4. Query the directory extension claims from Microsoft Graph API, appended to the directory schema extension app* that Graph API can call

Please note: for sAMAccountName we’re not using the approach where we add directory extensions to a Graph-API-queryable application = NO DIRECTORY EXTENSION SYNC IN AAD CONNECT NEEDED


Checklist for using Claims Mapping Policy

Pre-requisite: have the client application and the web API ready before proceeding

#Example App to Add the Claims 
AzureADPreview\Connect-AzureAD
$Definition = [ordered]@{
    "ClaimsMappingPolicy" = [ordered]@{
        "Version" = 1
        "IncludeBasicClaimSet" = $true
        "ClaimsSchema" = @(
            [ordered]@{
                "Source" = "user"
                "ID" = "onpremisessamaccountname"
                "JwtClaimType" = "onpremisessamaccountname"
            }
        )
    }
}
$pol = New-AzureADPolicy -Definition ($Definition | ConvertTo-Json -Depth 3) -DisplayName ("Policy_" + ([System.Guid]::NewGuid().guid) + "_" + $Definition.Values.ClaimsSchema.JwtClaimType) -Type "ClaimsMappingPolicy"

$entApp = New-AzureADApplication -DisplayName ("DemoApp_" + $Definition.Values.ClaimsSchema.JwtClaimType)
$spnob =  New-AzureADServicePrincipal -DisplayName $entApp.DisplayName -AppId $entApp.AppId 
Add-AzureADServicePrincipalPolicy -Id $spnob.ObjectId -RefObjectId $pol.Id 
#From the GUI change the Identifier and acceptMappedClaims value (From the legacy experience)

  • Generally: the app that will emit the claims is not the one you use as the clientID (the client subscribing to the audience)
    • Essentially you should create an un-trusted client with a clientID, and then add the audience/resource you’re using under API permissions
  • Ensure that the SPN has an IdentifierUri that matches a custom domain registered in the tenant
    • The reasoning is vaguely explained here & here
      • Whatever research work the feedback senders did, it sure looked in-depth 🙂
  • Update the app manifest to accept mapped claims
    • (Works now in the new experience too) Do this in the legacy experience; the new experience, at least in my tenant, didn’t support updating this particular value
”Insufficient privileges to complete the operation”

If mapped claims are not accepted in the manifest, and the pre-requisites are not satisfied, you might get this error:

”AADSTS50146: This application is required to be configured with an application-specific signing key. It is either not configured with one, or the key has expired or is not yet valid. Please contact the application’s administrator.”

  • Below is an example of the manifest changes (acceptMappedClaims, and an identifier URI matching a verified domain)
     "id": "901e4433-88a9-4f76-84ca-ddb4ceac8703",
    "acceptMappedClaims": true,
    "accessTokenAcceptedVersion": null,
    "addIns": [],
    "allowPublicClient": null,
    "appId": "9bcda514-7e6a-4702-9a0a-735dfdf248fd",
    "appRoles": [],
    "oauth2AllowUrlPathMatching": false,
    "createdDateTime": "2019-06-05T17:37:58Z",
    "groupMembershipClaims": null,
    "identifierUris": [
        "https://samajwt.dewi.red"
    ],

Testing

If you’re planning to use an identifier based on a non-verified domain, you will get:

”AADSTS501461: AcceptMappedClaims is only supported for a token audience matching the application GUID or an audience within the tenant’s verified domains.”

References

https://github.com/MicrosoftDocs/azure-docs/issues/5394

https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/active-directory/develop/active-directory-claims-mapping.md

https://github.com/Azure-Samples/active-directory-dotnet-daemon-certificate-credential#create-a-self-signed-certificate

https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/active-directory/develop/v2-protocols-oidc.md#fetch-the-openid-connect-metadata-document

https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/active-directory/develop/active-directory-claims-mapping.md#example-create-and-assign-a-policy-to-include-the-employeeid-and-tenantcountry-as-claims-in-tokens-issued-to-a-service-principal

Decode JWT access and id tokens via PowerShell