Integrating NLog with Azure Cloud Service Diagnostics

NLog is a popular open-source logging framework for .NET applications. In my opinion, it is very flexible and more feature-rich than, say, the built-in .NET tracing framework. For instance, NLog supports asynchronous logging, flexible trace formatting, and an auto-archiving mechanism that I find very useful.

Several articles already discuss how to use NLog from an Azure Cloud Service. Most of them instruct you to make NLog log to the built-in .NET trace sources; from there, the default support for Azure diagnostics kicks in and persists the logs to a preconfigured Storage account.

However, this approach has the drawback that any tracing done in your application will also end up in the persisted logs.

In the project I’m working on, I could not use this approach because I wanted multiple distinct trace files. The project implements a kind of process runner, and I wanted to capture and persist each process’s STDOUT and STDERR streams in a dedicated trace log.
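To illustrate the per-process capture, here is a minimal sketch (the names and wiring are hypothetical, not taken from the actual project): each spawned process gets its own NLog logger, and the process’s output events are forwarded to it.

```csharp
using System;
using System.Diagnostics;
using System.IO;
using NLog;

public static class ProcessRunner
{
    public static Process Run(string exePath, string args)
    {
        // One dedicated logger per process; an NLog rule can route each
        // "Process.*" logger to its own File target.
        Logger log = LogManager.GetLogger(
            "Process." + Path.GetFileNameWithoutExtension(exePath));

        var psi = new ProcessStartInfo(exePath, args)
        {
            UseShellExecute = false,
            RedirectStandardOutput = true,
            RedirectStandardError = true,
        };

        var process = new Process { StartInfo = psi };

        // Forward STDOUT and STDERR lines to the dedicated logger.
        process.OutputDataReceived += (s, e) => { if (e.Data != null) log.Info(e.Data); };
        process.ErrorDataReceived  += (s, e) => { if (e.Data != null) log.Error(e.Data); };

        process.Start();
        process.BeginOutputReadLine();
        process.BeginErrorReadLine();
        return process;
    }
}
```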

So the strategy I ended up with consists of the following steps:

  1. Write NLog messages to a dedicated file. One file per logger.
  2. Use NLog auto-archiving feature to periodically archive traces to a particular folder.
  3. Configure Azure diagnostics to persist archived traces to a Storage account.

In practice, this works really well.

The remainder of this post walks you through configuring your Cloud Service to persist NLog file traces to an Azure Storage account.

Writing NLog traces to local storage

First, you need to configure local storage for your Cloud Service, so that NLog can write traces to the filesystem.

For reference, your Cloud Service’s ServiceDefinition.csdef file is modified like so:

<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="Host" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition" schemaVersion="2014-06.2.4">
  <WorkerRole name="WorkerRole" vmsize="Small">
    <LocalResources>
      <LocalStorage name="logs" cleanOnRoleRecycle="true" />
    </LocalResources>
  </WorkerRole>
</ServiceDefinition>

Next, NLog needs to know where to write the log files. The easiest way I found was to create an environment variable that refers to the local storage location and use that variable in NLog’s configuration. To define an environment variable, you need to manually edit your Cloud Service’s ServiceDefinition.csdef file like so:

<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="Host" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition" schemaVersion="2014-06.2.4">
  <WorkerRole name="WorkerRole" vmsize="Small">
    <Runtime>
      <Environment>
        <Variable name="NLOGDIR">
          <RoleInstanceValue xpath="/RoleEnvironment/CurrentInstance/LocalResources/LocalResource[@name='logs']/@path" />
        </Variable>
      </Environment>
    </Runtime>
  </WorkerRole>
</ServiceDefinition>

This modification creates an environment variable named NLOGDIR that points to the location of each instance’s local storage resource directory. Having this done in the Service Definition ensures that the environment variable is created before the role starts, so that NLog can take advantage of it during its configuration.

In NLog’s configuration, use the environment variable as the location of the log files:

<!-- fileName/archiveFileName are illustrative; the archive subfolder matches
     the relativePath used in the diagnostics configuration below. -->
<target name="App" xsi:type="File"
  fileName="${environment:variable=NLOGDIR}\App.log"
  archiveFileName="${environment:variable=NLOGDIR}\archive\App.{#####}.log"
  lineEnding="Default" autoFlush="true" keepFileOpen="false" concurrentWrites="true"
  archiveEvery="Minute" archiveNumbering="Sequence" maxArchiveFiles="720" />
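For completeness, a rule is needed to route loggers to this target; a minimal sketch (the logger name pattern and level are illustrative):

```xml
<rules>
  <!-- Route all loggers to the "App" file target; tighten the name
       pattern to give each logger its own dedicated target. -->
  <logger name="*" minlevel="Trace" writeTo="App" />
</rules>
```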

Persisting NLog traces to an Azure Storage account

In order to persist trace files to an Azure Storage account, you must configure Azure diagnostics to periodically transfer the contents of the trace files from the local storage to a preconfigured Azure Storage account.

First, the Azure Diagnostics Module must be enabled for your Cloud Service. This is usually done by default when creating the Cloud Service project from Visual Studio.

Make sure to configure an appropriate Connection String pointing to an Azure Storage account where trace files will be persisted. During development, of course, the development storage will do.
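For reference, the diagnostics connection string lives in your ServiceConfiguration.*.cscfg; a typical development-time setting looks like this (the development storage value shown is the standard shorthand):

```xml
<ConfigurationSettings>
  <!-- Use development storage locally; substitute a real Storage account
       connection string when deploying to Azure. -->
  <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"
           value="UseDevelopmentStorage=true" />
</ConfigurationSettings>
```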

In theory, you can configure Azure diagnostics declaratively, using the Diagnostics.wadcfg file, stored under your role node in Solution Explorer.

In order for the configuration to take effect, this file must be included in, or better, linked into, the role project. Make sure to set the “Copy to Output Directory” property of this file to “Copy always” or “Copy if newer”.

The contents of the Diagnostics.wadcfg file must be updated so that the Azure Diagnostics monitor can transfer the contents of the local storage directory to a specified blob container in the configured Storage Account.

<DiagnosticMonitorConfiguration configurationChangePollInterval="PT1M" overallQuotaInMB="4096"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2010/10/DiagnosticsConfiguration">
  <DiagnosticInfrastructureLogs />
  <Directories scheduledTransferPeriod="PT1M">
    <DirectoryConfiguration container="wad-custom-container" directoryQuotaInMB="1024">
      <LocalResource name="logs" relativePath=".\archive" />
    </DirectoryConfiguration>
  </Directories>
</DiagnosticMonitorConfiguration>

Make sure not to exceed the overall quota; otherwise the Diagnostics Monitor will crash on startup. Thanks, Jérémie, for the time taken to troubleshoot this!
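If you prefer configuring diagnostics in code rather than declaratively, the classic Microsoft.WindowsAzure.Diagnostics API (pre-2.5 SDK) offers an imperative equivalent. A hedged sketch, typically called from the role’s OnStart (the container name and resource name mirror the declarative configuration above):

```csharp
using System;
using System.IO;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;

public static class DiagnosticsSetup
{
    public static void ConfigureLogTransfer()
    {
        var config = DiagnosticMonitor.GetDefaultInitialConfiguration();

        // Transfer the archived NLog traces to blob storage every minute.
        config.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
        config.Directories.DataSources.Add(new DirectoryConfiguration
        {
            Container = "wad-custom-container",
            DirectoryQuotaInMB = 1024,
            Path = Path.Combine(
                RoleEnvironment.GetLocalResource("logs").RootPath, "archive"),
        });

        DiagnosticMonitor.Start(
            "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", config);
    }
}
```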


3 Responses to Integrating NLog with Azure Cloud Service Diagnostics

  1. What version of the Azure SDK are you using? This all worked great for me until v2.5.

  2. Hi Joe,

    It seems you’re right. After upgrading to the newer .wadcfgx format, the logs are no longer persisted to blob storage. Weird!
