If you are working with Azure, chances are that you have at least indirectly consumed Azure Blob Storage at some point. Azure Storage in general is one of the elementary building blocks of almost any Azure service, and in many cases you end up dealing with storage authorization sooner or later. This is where SAS tokens enter the picture, and that is what this article is about.
General description of SAS tokens from the Microsoft docs:
A shared access signature (SAS) provides secure delegated access to resources in your storage account without compromising the security of your data. With a SAS, you have granular control over how a client can access your data. You can control what resources the client may access, what permissions they have on those resources, and how long the SAS is valid, among other parameters.
https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview
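For illustration (placeholder values, not taken from the docs page above), a blob SAS URL is simply the blob URL with the SAS token appended as query parameters:

https://<account>.blob.core.windows.net/<container>/<blob>?sv=2019-02-02&sr=b&sp=r&st=2021-01-01T00:00Z&se=2021-01-01T00:05Z&sig=<signature>

Here sv is the storage service version, sr the resource type (b = blob), sp the granted permissions (r = read), st and se the start and expiry times, and sig the signature computed over those fields with the account (or delegation) key.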

The samples here are written in NodeJS, but the approaches themselves are fairly framework agnostic and work regardless of the runtime or platform.

- When you use Azure as the platform you gain the benefit of VNET service endpoints, and of managed service identities for App Service based and containerized approaches
- Other options exist as well (Private Link etc.)
While multiple technical approaches to SAS token based storage access exist, two tend to stand out.
- Proxy based
  - The proxy processes the authorization and business logic rules and then pipes (proxies) the blob to the requester via a SAS link stored in Table Storage (the SAS link could also be created ad hoc). Using Table Storage is by no means mandatory here, but it provides a convenient way to keep references to SAS links
  - Even behind a proxy it makes sense to use SAS links, as they narrow the access of the particular NodeJS function down to match the requester's permissions
  - This method also allows comprehensive error handling, including retry logic and separate try/catch blocks for transient Azure Storage errors (a retry-filter sketch follows after this list)
    - Azure Storage errors are, to be honest, rare, but they can nonetheless happen
    - With the redirect-based approach all error handling happens between the user's client and the storage HTTP service itself
  - The proxy-based approach allows locking the storage account down at the network level to the web application only
    - In this approach only the proxy should be allowed to reach the storage account from a network perspective. The following options are available:
      - Azure Storage Firewall
        - Authorized Azure VNETs (VNET service endpoints)
        - IP address lists
      - Private Link (perhaps a subject for a separate blog)
- Redirect based
  - The proxy processes the authorization and business logic rules, and then redirects the requester to the blob object via a SAS link
  - After the SAS link is obtained (by the user's browser) there is nothing to prevent the user from sending the link to another device and using it there, unless Azure AD SAS delegation or per-download IP restrictions are applied to the link
  - Redirect based might be better if you are concerned about the complexity and overhead introduced by the proxy-based method (with redirects the Azure Storage account's HTTP service processes the downloads and can likely handle a large amount of concurrency)
Both of these options are also explored in the Microsoft docs.

- It's worth mentioning that for both of these approaches a great deal of networking and authorization variations exist besides the ones presented here.
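As a side note on the transient error handling mentioned above, here is a minimal sketch (my assumption of one way to do it, not code from the samples below) using the retry filters that ship with the azure-storage SDK:

// Wrap the table service with an exponential-backoff retry filter so transient
// Azure Storage errors are retried before the error ever reaches the route handler
var azure = require('azure-storage')
var tableService = azure
  .createTableService() // picks up AZURE_STORAGE_CONNECTION_STRING
  .withFilter(new azure.ExponentialRetryPolicyFilter(3, 3000)) // 3 retries, 3 second base interval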
Examples
Prerequisites: SDK and dependencies
- The Storage SDK used is the 'azure-storage' SDK.
- For the Node.JS web server, the legendary ExpressJS
- The Node.JS native HTTPS API is used to create a proxy client that pipes the client connection in the proxy-based method
- The important dependencies for both approaches are:
"dependencies": { "azure-storage": "^2.10.3", "express": "^4.17.1", "jsonwebtoken": "^8.5.1", "jwk-to-pem": "^2.0.3", "jwks-rsa": "^1.6.0", }

Samples for both approaches
- The samples highlight the use of ExpressJS and native Node APIs to achieve either method. The Azure Storage code is abstracted into separate functions, and both methods use the same Azure Storage access functions.
Proxy based
Below is an example of an ExpressJS based app, which has a function invoked directly for the GET verb on the route ('/receive').

- App Service and storage configuration
  - App Service on an S1 plan
  - App Service custom DNS binding with an App Service managed certificate
  - VNET integration with a stand-alone VNET
  - Storage account (v2) with the firewall set to authorize the selected VNETs
- Phase 1: authorize; the token verified by JWT.verify() must match the user entry in req.query.to
  - Return an authorization error if the signed-in user doesn't match req.query.to
- Phase 2: query Table Storage with req.query.to
- Phase 3: proxy the SAS link connection
  - Pipe the response if it was ok!
app.get('/receive', (req, res) => {
  // Node's native HTTPS client is used to proxy the SAS download
  // (decode, chalk, url and QueryTables are required/defined elsewhere in the app)
  var proxyClient = require('https')
  var usr = (decode(req.cookies.token).email)
  console.log(chalk.green(`${req.query.to} with ${usr}`))
  // Phase 1 authorization: the token verified by JWT.verify() must match the user entry in req.query.to
  if (!usr.includes(req.query.to)) {
    // Return an authorization error if the signed-in user doesn't match req.query.to
    return res.send(`Authorization failed. Not logged in as recipient ${req.query.to} - Logged in as ${usr} `)
  }
  // Phase 2: query Table Storage with req.query.to
  QueryTables(req.query.from, req.query.to, req.query.uid, (error, result, response) => {
    var sd = url.parse(response.body.value[0].filename).path
    // Phase 3: proxy the SAS link connection
    proxyClient.get(response.body.value[0].sasLink, (proxyres) => {
      console.log(proxyres.statusCode)
      // Pipe if the response was ok!
      if (proxyres.statusCode == 200) {
        var content = `attachment; filename=${sd}`
        res.setHeader('content-disposition', content)
        proxyres.pipe(res)
      } else res.render('failed', {
        message: "Link expired, due to this the SAS link cannot be verified. Server errorMsg " + proxyres.statusMessage
      })
      proxyres.on('end', () => console.log('end'))
    }).end()
  })
})
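The QueryTables helper above is part of the Azure Storage code that is abstracted into separate functions. A minimal sketch of what it could look like with the azure-storage TableService follows; the table name 'sharedfiles' and the from/to/uid/filename/sasLink columns are my assumptions based on how the handlers read the response:

// Hypothetical sketch of the QueryTables helper, not the author's exact implementation
var azure = require('azure-storage')
// createTableService() picks up AZURE_STORAGE_CONNECTION_STRING (account name + key also work)
var tableService = azure.createTableService()

function QueryTables(from, to, uid, callback) {
  // Build an OData filter that matches the sender, recipient and unique id of the shared file
  var query = new azure.TableQuery()
    .where('from eq ?', from)
    .and('to eq ?', to)
    .and('uid eq ?', uid)
  // The callback signature (error, result, response) matches the route handlers above;
  // the raw OData entities are available as response.body.value, e.g. response.body.value[0].sasLink
  tableService.queryEntities('sharedfiles', query, null, callback)
}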
Redirect based
The redirect-based method is fairly simple, and essentially just uses the res.redirect() method of ExpressJS after authorizing the user.

- Phase 1: authorize; the token verified by JWT.verify() must match the user entry in req.query.to
  - Return an authorization error if the signed-in user doesn't match req.query.to
- Phase 2: query Table Storage with req.query.to
- Phase 3: redirect the user to the SAS link
app.get('/redirect', (req, res) => {
  console.log('redirecting')
  var usr = (decode(req.cookies.token).email)
  console.log(chalk.green(`${req.query.to} with ${usr}`))
  // Phase 1 authorization: the token verified by JWT.verify() must match the user entry in req.query.to
  if (!usr.includes(req.query.to)) {
    // Return an authorization error if the signed-in user doesn't match req.query.to
    return res.send(`Authorization failed. Not logged in as recipient ${req.query.to} - Logged in as ${usr} `)
  }
  // Phase 2: query Table Storage with req.query.to and redirect the user to the SAS link
  QueryTables(req.query.from, req.query.to, req.query.uid, (error, result, response) => {
    res.redirect(response.body.value[0].sasLink)
  })
})
Considerations for both approaches
- For the redirect method it is of utmost importance to keep the SAS link short-lived.
- For the proxy method, if you store the SAS link itself in Table Storage (instead of creating it based on specifications stored in Table Storage) you are more locked into providing longer lifetimes for the SAS tokens.
- Essentially you could create the SAS link with one-time, short-lived characteristics at the moment Table Storage is queried for the link details (a minimal sketch follows below).
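A minimal sketch of creating such a short-lived, ad-hoc SAS link with the azure-storage SDK; the account name and key come from environment variables here, and the helper name and five-minute lifetime are just placeholders:

var azure = require('azure-storage')
// Generating a SAS with this SDK requires the account name and key (see the note below on Azure AD delegation)
var blobService = azure.createBlobService(process.env.STORAGE_ACCOUNT_NAME, process.env.STORAGE_ACCOUNT_KEY)

function createShortLivedSasLink(container, blob) {
  var now = new Date()
  var expiry = new Date(now.getTime() + 5 * 60 * 1000) // valid for roughly five minutes
  // Read-only, time-boxed access policy for this single blob
  var sasToken = blobService.generateSharedAccessSignature(container, blob, {
    AccessPolicy: {
      Permissions: azure.BlobUtilities.SharedAccessPermissions.READ,
      Start: now,
      Expiry: expiry
    }
  })
  // Full URL the proxy can download from, or the redirect method can send the user to
  return blobService.getUrl(container, blob, sasToken)
}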
Other things:
- Using Azure AD SAS delegation is not directly available in the NodeJS SDK I am using.
- In most scenarios you can also replace public blob access with SAS tokens, in cases where you have a front end (proxy) that can facilitate access by creating SAS links.
- Check out the excellent docs.microsoft.com best practices article on using SAS tokens.
- So far, creating SAS links from the SDK has required using the account name and key connection methods (as in the sketch above).

Till next time!
Br, Joosua