New Office 365 Network Performance Analytics from Office365Mon

One of the most commonly requested features we’ve had from our customers at Office365Mon has been something to help them better understand network performance with their Office 365 tenants.  Today we’re happy to announce that we have released a set of services, tools and reports to do just that.  You can break down the performance of your Office 365 tenant between what’s happening in the tenant itself versus the network transport of your requests.  You can compare what the network performance is like to your Office 365 tenant from the different geographies where you have users.  You can also set notification thresholds, so you can be alerted when network performance at any location becomes unsatisfactory.

We rolled out the first part of this suite of services in June, with support for monitoring tenant Health Scores and farm processing times for SharePoint Online and OneDrive for Business.  We’ve extended that model now to include the following:

  • Set notification thresholds for network performance times from cloud-based probes
  • Set notification thresholds for network performance times from every geographic location where you install our Distributed Probes and Diagnostics agent
  • Capture network performance metrics for all cloud-based as well as Distributed Probes and Diagnostics agents
  • Monitor OneDrive for Business – as well as SharePoint Online and Exchange Online – with the new version of our Distributed Probes and Diagnostics agent

With these new features turned on, we’ll monitor and capture tenant- and network-specific performance metrics with each health probe we issue for SharePoint Online and OneDrive for Business.  This, coupled with our other performance and availability data, gives you a very clear picture of when and where problems or potential problems may be happening throughout your organization.  That helps you understand where bottlenecks are happening as they occur and gives you the insight needed to focus your resources where they can best be utilized.

Here’s a real-world example that occurred during an early release of these services.  Notifications were sent out for a tenant from multiple Distributed Probe agents in different geographies because the roundtrip time for the health probe exceeded the limits that had been set.  However, the network performance alerts did not fire, which meant that network transportation time was within limits even though the overall roundtrip time was not.  Looking at the reports in our Advanced Report gallery, you could also see that during this same time the tenant processing times had spiked.  In addition, the tenant’s Health Scores were spiking (counterintuitively, the higher the Health Score, the less healthy the tenant is considered to be).  By looking at these data points we could tell that in this instance the performance issue was the result of a bottleneck in the Office 365 tenant itself, not in the network.  That meant there was no need for us to spend resources evaluating our network, running diagnostics there, trying to identify bottlenecks in it, and so on – in short, it saved us a whole bunch of time and money.
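The triage in this example boils down to a simple rule of thumb, sketched here in Python (the function name and return values are illustrative only, not part of the Office365Mon service):

```python
def classify_bottleneck(roundtrip_alert, network_alert, tenant_spike):
    """Classify where a performance problem likely lives, given which
    alerts fired.  Slow roundtrips with no network alert, combined with
    spiking tenant processing times, point at the tenant itself."""
    if roundtrip_alert and not network_alert and tenant_spike:
        return "tenant"          # the scenario described above
    if roundtrip_alert and network_alert:
        return "network"         # transport time itself is out of bounds
    if not roundtrip_alert:
        return "healthy"         # nothing exceeded its threshold
    return "unknown"             # slow overall, but no clear culprit
```

The point is simply that having both the roundtrip and network measurements lets you subtract one from the other and attribute the delay.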

In addition to the notification aspect, once these tools are up and running for your Office365Mon subscription, there are a number of interesting reports to look at.  One of the really interesting ones compares the network performance at different geographies, at different times of day:


Here we can see the network performance from each of these locations, in 15-minute increments of the day.  The report is based on a 30-day rolling set of data, so you can see at a broader view how each location is doing relative to the others.  You can also spot at a glance whether there are certain times of day, even if only in certain locations, where your network performance is commonly problematic.

Another similar slice of this data is to view it by day of the week:


Again, many customers complain that on certain days of the week performance is worse than on others.  These reports give you the ability to identify just such situations, again on a geo-by-geo basis.

Breaking down tenant processing times versus network transportation times is also done quite easily with other new reports we’ve released.  For example, here is one of the reports mentioned above, which lets you see the tenant processing times as well as Health Scores, broken down by SharePoint Online versus OneDrive for Business.  This is an important difference to highlight because we have seen performance differences between the two services:


If you’re concerned about performance differences between SharePoint Online and OneDrive for Business, we can also break that down even further and show you what the performance is like for each, broken down by each geography where you have installed the Distributed Probes and Diagnostics agent:


Finally, you can also get a comparison of the amount of time requests are spending on tenant processing, versus network transportation – and, you can get that also broken down by each geography where health probes are being issued:


One of the other incredibly useful things about all of our charts is that you can simply click on items in the legend to remove them from, or add them back into, any chart.  So if you want to drill down and compare 2 or 3 or however many different geos to each other you can do that.  Or compare a particular service across locations.  The ways in which you can slice, dice and analyze this data are extensive and incredibly useful as you analyze and work to maximize your connectivity with Office 365.


It’s Turned On – Go Use It!

The core Health Score, tenant performance, and network performance metrics for our cloud-based probes are turned on for everyone, so go use them!  Notification thresholds for tenant or network performance issues can be managed by going to the Configure Office365Mon page on our site.  The new network performance and OneDrive for Business monitoring features of the Distributed Probes and Diagnostics agent are also available immediately for all customers that are licensed to use them.  They do require you to download the latest version of our agent from the Configure Distributed Probes page on our site.  If you already have an agent installed, you’ll need to uninstall it and then install the latest version.


Stay in the Know, Stay in Control

These services, tools, and reports all fit in perfectly with our mantra of Stay in the Know, Stay in Control.  You’ll have amazing insights into the breakdown of performance across geographies with your Office 365 tenant, ensuring that you can deliver top quality service to your users.  I hope you’ll try it out and, as always, send any feedback or other ideas to us.  We love to hear them and incorporate as many of them as possible into our service.

From Sunny Phoenix,



Doing The Mash with Excel Web App and SkyDrive

Hey folks, I just wanted to pass along some very cool new features of the Excel Web App and SkyDrive.  There’s a new site up called Excel Mash Up that has some really interesting ways that you can upload Excel workbooks to SkyDrive and then embed them in your web site.  It also includes a bunch of slick programmability features using JavaScript and a nice programming model.  There are a bunch of demonstrations up on the site that you can do live, and each one is complete with source code as well as pointers to the documentation where you can get more info.  It’s really well put together, and even includes a pretty awesome demo that incorporates data from one of these Excel workbooks with Bing Maps for a little chart and map combination.  Very cool stuff!  I really encourage you to take a look at what can be done, and the fact that all of this is possible through a free SkyDrive account is just amazing.

P.S.  If I don’t talk to you all before Christmas Day, I hope everyone has a swell holiday!!  I’m already starting to plot out nuggets of information on the next version of SharePoint that I hope to begin sharing with you all as soon as possible.

OneDrive for Business Browser Using o365 APIs

I’ve been spending a lot of time recently talking to different partners and customers about building applications using the Apps for Office and Apps for SharePoint platforms.   One of the questions I seem to get most frequently though has been about one of the new additions to our development platform, which is the o365 APIs and specifically OneDrive.  As I’ve been trying to field the various requests about how to use the o365 APIs for OneDrive, I decided that probably the best solution would be just to write a little browser application myself that demonstrates the basic features of the API, and then leave it up here for all of you to download and deconstruct to your heart’s content.  So…guess what I did over my Christmas break??  Yeah, that’s why I called it Christmas “break” and not “vacation”.  😉

The other thing that I felt has been somewhat lacking from a documentation perspective is how to use ADAL generally and the o365 APIs specifically from a web app, so I decided to use that as the starting point for this application.   The result is an ASP.NET MVC app that shows off the process for getting an access token that you can use with the o365 APIs and “managing” it.  Once we have our access token there are a variety of different ways in which we can use it, and different options for working with OneDrive.  I kind of run the gamut in the sample application – from storing and retrieving an access token in Azure table storage to passing it along as a parameter in a GET and/or POST request.  In working with the o365 APIs I show a few different options:  getting the whole set of content at one time on the server side and then rendering it on the client, to building it one piece at a time using jQuery with an MVC controller and partial view, to using a Web API controller.  The app also does more than just list the contents of my OneDrive site – you can also create and delete folders, and upload and delete files.  I also added some code in the controllers to download files for completeness, but didn’t really do anything with it in my browser (because you can just click on the link I render for the file and download it that way).

Here’s what it looks like when I actually go to my OneDrive site:


Here’s what my browser looks like when all of the content has been retrieved on the server side and rendered all at once on the client:



Here’s what it looks like in the view where I retrieve everything client side (and also where I do the other file management tasks):


Create Your Applications

To begin, before you ever start writing any code, the first thing you’re going to need to do is to create two applications in your Windows Azure Active Directory (AAD) tenant – the first one is going to be used to authenticate users into the MVC application; the second one is going to be used to access OneDrive contents through the o365 APIs.  There are multiple options for both of these applications. 

Create Application for MVC Authentication

If you’re using Visual Studio 2013, then when you create a new MVC application you have the option to configure it to use AAD for authentication.  The process in pictures looks like this: 



Alternatively you can just go into the Azure management portal and create the application manually.  This is what I did for two reasons:  1) you need to go there anyway to get the client ID and other information necessary to configure your application, and 2) starting the project using the authentication wizard in Visual Studio makes it very difficult to configure it such that you can use the same AAD authentication process to secure both your MVC and Web API controllers.  So if you decide to create the application manually you just need to grant the application permission to “Enable sign-on and read users’ profiles”.  Here’s what my application looks like in AAD:


Create Application to Access OneDrive Content

To create the second application for accessing OneDrive content, you again have a couple of choices:  1) you can install Update 4 or later for Visual Studio 2013 and select the Add Connected Service menu option when you right click on your project in Visual Studio or 2) you can create it manually.  I think you get how to add it manually at this point; if you want to configure it using Visual Studio then take a look at the Add a New Application section of my earlier Share-n-Dipity post on the topic.  My completed application looks like this; you can see that I just granted the application the right to read, edit and delete users’ files:

Now that we have all the application security aspects defined we can start writing code.  That being said…I’m going to preface this by saying I’m not going to do a super deep dive on all of the code in the project in this blog post.  The reason for that is because a) it’s somewhat complicated, b) there’s a lot of it, and c) I’m attaching the entire project’s source code to this post.  I’ll focus on conceptually what was done and why (with appropriate detail, don’t worry), but I won’t be covering every line of code like I do in some posts.  You’re going to have to be a little adventurous and download the project and open it up yourself.  Trust me when I say that it will be way more valuable than reading a 50 page whitepaper / blog post on using the o365 APIs.  As it stands now Word is telling me that I’m already on page 10.  Blech.

Working with Access Tokens in MVC

There are a few things worth noting as it relates to AAD, ADAL, access tokens and the o365 APIs.  The first and most important thing to remember is that every resource needs its own access token.  I can’t emphasize this enough…it’s easy to get lost quickly managing your access tokens.  You’ll probably start out getting an access token for the Discovery Service to get the Url for a user’s OneDrive site, and then not understand why you can’t read someone’s OneDrive files using that same access token.  Well, it’s because the Discovery Service and the OneDrive service are two different resources, so you need to get separate access tokens for each.  Now you may ask, well, I thought ADAL supported multi-resource refresh tokens?  Yes, this is true, but that just means that you don’t have to prompt a user each time you want to access a different resource – however, you still have to go get an access token for that resource and manage it in some way.

The process becomes more complicated in a web application, which is why I wanted to take the opportunity to cover that here as well.  I often do my proof of concept apps using winforms or a console app and it is genuinely easier to work through the goo that way.  But a little harsh reality is always good to make sure we understand what’s required and how we can make things easier.  The biggest difference is that when you are doing authentication using ADAL from a web app you need to use a client secret.  This is unlike almost every other example you’ll see posted out there, which is probably why there have been so many questions about it.  In a nutshell, you need to get an authorization code to access the resource, and then you can use a client ID and password (i.e. secret) to get an AuthenticationResult, which contains an access token and refresh token.  Once you have the tokens then you really want to persist them somewhere, because otherwise you’ll be pestering your users non-stop to authenticate back to AAD to get a token.  So given that background, here is the basic pattern that I used to get an access token and refresh token for accessing these resources:

  • Query the database to see if I have an access token for the current user and resource.  In my solution I used Azure table storage to persist the tokens.
  • If there is an access token:
    • Is it still valid?
      • Yes – grab the access token.
      • No – ask for a new access token using the refresh token.  Did it return a new access token?
        • Yes – grab the access token and update the database with the new access token and token expiration time.
        • No – the refresh token has expired too, so I need to start over and request a code again that I can use to get another AuthenticationResult.
  • If there is NOT an access token:
    • Ask for a code that I can use to get an AuthenticationResult.
  • If I have to get a code, then I:
    • Use it to get an AuthenticationResult using the AcquireTokenByAuthorizationCode method of the AuthenticationContext class; this is where I’ll pass in the client ID and password.
    • Store the access and refresh token in the database.
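Since the project itself is C#, here’s a language-neutral sketch of that decision flow in Python; `store` is a stand-in for the Azure table storage lookup and `refresh_fn` for the refresh-token call (both names are hypothetical):

```python
import time

def get_access_token(store, upn, resource, refresh_fn):
    """Return a usable access token for (user, resource), or None when the
    user has to be sent back through the authorization-code flow."""
    record = store.get((upn, resource))       # one token per (user, resource)
    if record is None:
        return None                           # no token at all: go get a code
    if record["expires"] > time.time():
        return record["token"]                # still valid: use it directly
    refreshed = refresh_fn(record["refresh_token"])
    if refreshed is not None:                 # refresh worked: persist new pair
        store[(upn, resource)] = refreshed
        return refreshed["token"]
    return None                               # refresh token expired too: start over
```

A None result corresponds to redirecting the browser to the AAD authorization page for a new code.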

I built some helper classes for working with Azure so it simplifies my code somewhat, but here’s what it looks like:


//create an instance of the Azure storage helper
StorageHelper sh = new StorageHelper();

//look first for the Discovery resource token
OneDriveRequest odr = GetOneDriveToken(ref upn, DiscoveryResourceId);

//see if we have a record, but the token is invalid; if that's the case then we
//need to ask for a new token using the refresh token
if ((odr != null) && (!sh.IsTokenValid(odr)))
{
   //try and get a new access token with the refresh token;
   //if we're successful then go to the next step; otherwise
   //the refresh token is likely expired so we need to get
   //another code that can be used to get a new access token
   if (GetTokenWithRefreshToken(odr.RefreshToken, upn,
      DiscoveryResourceId, "http") != null)
      return RedirectToAction("ProcessCode", routeVals);
}
else if ((odr != null) && sh.IsTokenValid(odr))  //redirect cuz we have a valid token
   return RedirectToAction("ProcessCode", routeVals);

//if we got here, it means we didn't have an access or
//refresh token we could use

//create an OAuth2 request, using the web app as the client; the authorize
//Url format was stripped from the original post, so the standard AAD
//endpoint is shown here as a reconstruction
authorizationRequest = String.Format(
   "https://login.windows.net/common/oauth2/authorize" +
   "?response_type=code&client_id={0}&redirect_uri={1}&state={2}",
   HttpUtility.UrlEncode(ClientId),
   HttpUtility.UrlEncode(
      this.Request.Url.GetLeftPart(UriPartial.Authority).ToString() +
      "/Home/ProcessCode"),   //adjust to wherever your ProcessCode action lives
   HttpUtility.UrlEncode(
      RequestType.Discovery.ToString() + ";" +
      HttpUtility.UrlEncode(DiscoveryResourceId) + ";http"));

//return a redirect response
return new RedirectResult(authorizationRequest);

So in short, I’m looking for a valid access token – if I have one or I can get one from my refresh token then I’m going to continue onto the next step, which I do by redirecting to another action in my controller.  If I don’t have a valid token then I’m going to redirect the user’s browser to the AAD oauth authorization page where they can consent to using my application.  When they do that, AAD will look at the redirect_uri in the query string and send the browser back there when it’s finished; the query string will contain the code I can use to get the access token.

The other thing that may not be obvious is that I also include some information in the “state” query string.  That’s because I use the same controller action to process code requests for all resources.  Again, remember that every resource requires its own access token, so since I’m using both the Discovery and OneDrive services, that means I’ll be acquiring two access tokens.  That’s actually how I store it in the database – the user’s UPN and the resource create the key for each row in the table.  So I can ask the database for the Discovery access token for a given user and know that it will be separate from that same user’s access token for the OneDrive service.
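The packing and unpacking of that “state” value can be sketched like this (a Python approximation of the semicolon-delimited format the controller uses; the Discovery resource Url below is just sample data):

```python
from urllib.parse import quote, unquote

def pack_state(request_type, resource_id, scheme="http"):
    """Build the 'state' value: request type, Url-encoded resource ID,
    and a scheme hint, joined with semicolons."""
    return ";".join([request_type, quote(resource_id, safe=""), scheme])

def unpack_state(state):
    """Split a 'state' value back into its three parts."""
    request_type, resource_id, scheme = state.split(";")
    return request_type, unquote(resource_id), scheme
```

Url-encoding the resource ID keeps its own separators from colliding with the semicolon delimiters.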

All roads in that first controller method lead to my ProcessCode action.  That code can get complicated rather quickly if you’re just trying to grok through it in a blog post, so again, I’ll just walk through it more from a conceptual level.  To begin with, the method signature looks like this:


public async Task<ActionResult> ProcessCode(string code, string error, string error_description, string resource, string state)


Since I’m redirecting to this code whether I’ve asked for an access code or not, the first thing I do is check for whether the “code” parameter is null or empty.  If it’s not, then I know that we’re responding to a request for a code that I can use to get an AuthenticationResult.  What I’ll do then is this:

  • Get the client ID and password out of my web.config file.  I’m going to use the client ID and password for the o365 AAD application, not the MVC AAD application.

  • I get a new AuthenticationResult by calling the AcquireTokenByAuthorizationCode on the AuthenticationContext.

  • I store the access and refresh token from my AuthenticationResult in the database.

If the “code” parameter is null or empty then I just query the database for the access token.  Since I’ve passed the resource I’m trying to access in the “state” parameter when I call this action, I can extract that out now so I know what data to get from the database. 
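In rough terms, the dispatch in ProcessCode looks like this (a Python sketch; `store` and `exchange_code_fn` are hypothetical stand-ins for the database and for ADAL’s AcquireTokenByAuthorizationCode call):

```python
def process_code(store, upn, code, state, exchange_code_fn=None):
    """If AAD sent back an authorization code, exchange it for tokens and
    persist them; otherwise the tokens are already in the database and the
    'state' value tells us which resource to load them for."""
    request_type, resource, _ = state.split(";")
    if code:                                  # responding to a code request
        tokens = exchange_code_fn(code, resource)
        store[(upn, resource)] = tokens       # save access + refresh token
    else:                                     # no code: token already stored
        tokens = store[(upn, resource)]
    return request_type, tokens
```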

Now that I’ve gone through that little bit of gymnastics, I’ve got my pattern in place that I can use for the rest of the app.  At a high level it’s basically going to go like this:

  • Does the “state” parameter indicate that I want to access the Discovery service?

    • Yes – I’m going to create a new DiscoveryClient and use the access token I got above.  After I have that I’m going to ask for information about the “MyFiles” capability.  That will return two important pieces of information for me:  a resource ID and an endpoint URI.  The resource ID is what I’ll need to use to get an access token to access the “MyFiles” capability; the endpoint URI is where I’ll send my requests for data.

      • Once I have the ID and URI I query the database to see if I have an access token for it.  It then follows the same exact pattern I described above – does it exist and is it still valid?  If it exists but isn’t valid then try using the refresh token to get a new access token.  If that works, or if it was already valid, then I have an access token I can use with the “MyFiles” capability (i.e. OneDrive) and I’m going to redirect to this action all over again, but this time I’m going to configure the “state” parameter to say I want to access the OneDrive service.  If I don’t have a valid access token for OneDrive, then I need to get a code for that resource that I can turn into an access token.  To do that, I’m going to use virtually the same exact code I showed above, I’m just going to use the resource ID for the “MyFiles” capability instead of the Discovery Service resource ID.  I’ll use the same redirect_uri so that it comes back to this same exact controller method and we start this little dance all over again.

    • No – I’m not trying to use the Discovery Service, I’m trying to use the OneDrive service (a.k.a. the “MyFiles” capability).  Based on the code I just described above, at this point I know I have a valid access token for the OneDrive service.  What I’m doing initially in my scenario is just getting a list of all the content in my OneDrive site, so I’ll create a new SharePointClient using my access token.  I ask for the list of files, which by default just returns the root folder (“Shared with Everyone”).  With that, I’ll call a method I created that gets all of the details about the folder, along with all of the folders and files it contains.  In this case for each folder I find I’ll call this same method recursively so I end up with all of the folders and all of the files in my OneDrive site.  When I’m done getting all of my content I plug it into the TempData dictionary that MVC provides and I redirect one more time back to my Home controller.  From there it returns a view that uses the OneDrive content in the TempData dictionary and the view emits the result on the page. 

That, in one ginormous nutshell, is how the first page gets loaded.

Working with Content

Now that the whole issue of dealing with access tokens is explained, most of the rest of the details are downhill.  For a variety of reasons, I used both the Microsoft Office 365 Discovery and My Files Library clients, as well as the o365 REST APIs to work with the content in the OneDrive site itself.  The Discovery client is really important to get started because it provides the two things you need to query OneDrive:  the resource identifier for the OneDrive site, and the actual endpoint you use to work with content in that OneDrive site.  In the Discovery client, you get these attributes from a CapabilityDiscoveryResult, which has ServiceResourceId and ServiceEndpointUri properties for this purpose.

With those valuable pieces of information at hand, I start out by getting the root folder of the OneDrive site by using an instance of the SharePoint client that comes with the Microsoft Office o365 libraries.  It was really designed to get the access token asynchronously as part of the client creation process, but my scenario doesn’t fit into that little box because I already have an access token.  Instead then, I create my client like this:

var odClient = new SharePointClient(new Uri(EndpointUri), () =>
   Task.Run(() =>
   {
      //we already have an access token, so just hand it back
      return accessToken;
   }));

Using that I can get the root folder like this:

IPagedCollection<IItem> items = await odClient.Files.ExecuteAsync();

That collection is going to return the “Shared with Everyone” folder that is common at the root of every OneDrive site.  In theory you could also have additional folders and whatnot at the root, but it is so rare that I chose not to focus on that for my scenario.  Really all I need is the first item in the collection because it has the ID for the item, and that ID is what I use to key all of my subsequent requests for data.  Simply stated, I took a look at the data that’s returned by a REST call for content and I mapped that into my own class, which I call a OneDriveItem.  In addition to all of the details about an item (which can be either a folder or a file), each OneDriveItem also has a List<OneDriveItem> of Folders and Files.  The content schema is the same for both, so this is why OneDriveItem is used for everything in my code sample.

You can probably imagine the rest of how that scenario plays out (or just download the code to look at it) – I have my item ID, so I a) get all of the details for that item, b) get all of the files contained in that item and c) get all of the folders contained in that item.  For each folder I find, I can optionally call the same method recursively to get all of the details, files and folders in it.  That’s really the main difference between the view that shows the entire contents of the OneDrive site at load time, and the view that loads it one folder at a time; when I show everything I call the method recursively and when I show it one folder at a time I do not.
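The recursive walk can be sketched like this (Python; `fetch_children` stands in for the REST calls that return the child folders and files for an item ID, and the dict fields are a simplification of my OneDriveItem class):

```python
def build_tree(fetch_children, item, recurse=True):
    """Fill in an item's Files and Folders lists; when recurse is True,
    descend into each child folder the same way, so the whole OneDrive
    site ends up in one tree.  Folders and files share the same schema."""
    folders, files = fetch_children(item["id"])
    item["files"] = files
    if recurse:
        item["folders"] = [build_tree(fetch_children, f) for f in folders]
    else:
        item["folders"] = folders        # one level only, for the lazy view
    return item
```

Passing recurse=False gives you exactly the one-folder-at-a-time behavior of the client-side view.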

Working with Content from Client Script

At this point I’ve explained how I get the access token and how I retrieve contents on the server side.  Working with data from client script involved using jQuery and calling either a standard MVC controller (when I want a chunk of HTML back) or a Web API controller (when I want to perform some action, like adding or deleting a folder).  I chose to write the access token to a hidden field on the page and send it back to the controller either in a query string or HTML form when I requested data.  Since everything is working over SSL I’m okay with that.  If you are not, another alternative could be to store some key to your database where you keep the access token, send that up, and look it up.  There are a few minor nuances in terms of what it means from a security standpoint, but I’ll leave it up to you and your requirements to dictate the appropriate implementation. 

That aside, the implementation for this sample was fairly straightforward.  When I needed HTML I called an MVC controller action because I can return a view from that, which returns HTML.  In fact I ended up using the same exact partial view to render both the view where I showed the entire contents all at once, and the view where I build the contents one folder at a time.

In addition to rendering contents, on the client view you can also add folders, delete folders, upload files, and delete files.  I also wrote methods in the controllers to download files but didn’t actually implement them in the client script view.  The reason for that is because I rendered each file as a hyperlink to that file, so you can just click on the file to open it up.  I added the code in the controller though in case you are working with some other modality (like a console app, or winforms app, or whatever).  Fortunately both the REST endpoint and Office 365 libraries make those other functions pretty straightforward to do.  Here’s how to create a new folder for example; the parameter NewFolderRequest is just a custom class I created that contains all of the HTML form values that were submitted with the request.  The return value from creating a folder is the same set of JSON that you get when you query the folder details, so I’m able to deserialize that back into a OneDriveItem again so it plugs into my various rendering methods really well.

public static async Task<OneDriveItem> CreateFolder(NewFolderRequest nfr)
{
   //PUT {base-url}/Files/{parent-id}/children/{folder-name}

   OneDriveItem result = null;

   try
   {
      //create the Url for the new folder
      string requestUrl = nfr.endpointUri + "/files/" + nfr.parentFolder +
         "/children/" + nfr.newFolder;

      HttpClient hc = new HttpClient();

      //add the header with the access token
      hc.DefaultRequestHeaders.Authorization = new
         System.Net.Http.Headers.AuthenticationHeaderValue(
         "Bearer", nfr.accessToken);

      //make the put request
      HttpResponseMessage hrm =
         (await hc.PutAsync(requestUrl, new StringContent("")));

      if (hrm.IsSuccessStatusCode)
      {
         string data = await hrm.Content.ReadAsStringAsync();
         result = OneDriveItem.GetFromJSON(data);
      }
   }
   catch (Exception ex)
   {
      //nothing else to do; result stays null so the caller knows it failed
      System.Diagnostics.Debug.WriteLine(ex.Message);
   }

   return result;
}

The gist is pretty simple really – I create a Url in the format that the o365 REST endpoint requires for creating a new folder.  I create a new HttpClient and add the authorization header using the access token I have.  Then I make the PUT request to create the new folder.  If it works, I take the JSON I get back and deserialize it into a OneDriveItem that I can then send back so it can be rendered by the client script into the view of the contents I’ve built in the page.  The other commands for adding and deleting items are pretty similar, so you can take a look at the code included with this post for specific details.


You have what should be a nice pattern now for working with o365 access tokens in an MVC app.  In addition, the solution includes working examples of the most common tasks when using the o365 APIs for OneDrive.  It also demonstrates using both the o365 libraries as well as the REST endpoints for working with data in OneDrive sites.  Using this sample as a reference should provide you with enough examples to get any of your o365 API applications up and running for OneDrive sites.

You can download the attachment here:

OneDriveBrowser and o365 APIs with a Custom Persistent Store for ADAL Token Cache

In this post I’m going to briefly cover a custom token cache that I wrote for use with ADAL.  The implementation itself is pretty straightforward from a coding perspective, so I will just highlight a few of the basics.  What has been less clear up to this point is the “right way” to use it in your apps to make sure that your cached tokens are being used correctly, so I’ll cover that as well.  The token cache code itself is based on the general outline of the capabilities Vittorio described in his blog post on the topic.  The sample application I’m using is an updated version of the OneDriveBrowser built on the o365 APIs that I originally covered in the previous post.

Coding Implementation

The code itself is pretty straightforward.  To begin with I created a new project and added the ADAL v2 Nuget package.  I then created a new class and had it inherit from Microsoft.IdentityModel.Clients.ActiveDirectory.TokenCache.  In my implementation I’m storing the tokens in Azure table storage.  I’m using it in an ASP.NET MVC application, and I need to track the tokens on a per user basis.  In my case the main customizations then that I need to account for are a) tracking tokens for individual users and b) managing access to Azure storage.  I provide hooks to both of those things through my constructor.

For tracking individual users I’ve used the user UPN.  I’ve configured my MVC app so that it’s secured with Azure Active Directory.  Since a user has to authenticate before they even get into my web app, I can be assured that they have a UPN.  For managing access to Azure storage I left in a couple of options.  I have the storage name and key included in one of the helper classes in my custom TokenCache project.  That way if you just want to take the code and compile it yourself, you can replace those values with the storage name and key that you are using.  As another option, I created a second constructor that takes the storage name and key in addition to UPN so you can set your credentials on the fly.  Here’s an example of what the two constructors look like (my class name is TokenStore):


public TokenStore(string UPN)

public TokenStore(string UPN, string storageName, string storagePassword)


Inside the class I’ve implemented BeforeAccessNotification, AfterAccessNotification, and DeleteItem.  The notifications are hooked up from my constructors with code like this:

this.BeforeAccess = BeforeAccessNotification;

this.AfterAccess = AfterAccessNotification;


DeleteItem doesn’t have an event handler to hook up to, so in that case I just override it, invoke the base class method first, and then do my stuff.  Again, the actual code for all of this is pretty simple.  I created a helper class to manage all of the Azure stuff, so in pseudo code terms my implementations look like this:

  • BeforeAccess:  use the UPN that was passed into the constructor and the resource ID that is being requested.  Look in Azure storage to see if I have a token in there for that combination; if so, then get the byte array out of table storage and call the Deserialize method that comes from the base class.

  • AfterAccess:  look to see if the cache was updated (for example, if the access token was refreshed).  If so, save the byte array back to table storage and set the HasStateChanged property of my token cache to false so that it doesn’t keep trying to overwrite the same token.

  • DeleteItem:  call the base class so the deleted token is removed from the in-memory cache, then delete the same item out of Azure table storage.
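The helper class essentially keys one serialized cache blob per user-plus-resource combination.  Here’s a minimal sketch of just that keying logic, with an in-memory dictionary standing in for the Azure table storage helper; the class and method names below are mine, not the ones from the attached source:

```csharp
using System;
using System.Collections.Generic;

// Minimal stand-in for the Azure table storage helper: it keys a
// serialized token-cache blob by user UPN + resource ID, which is
// the combination the notification handlers look up.
public class CacheBlobStore
{
    private readonly Dictionary<string, byte[]> _rows =
        new Dictionary<string, byte[]>();

    private static string Key(string upn, string resourceId) =>
        upn.ToLowerInvariant() + "|" + resourceId.ToLowerInvariant();

    // Called from BeforeAccess: return the stored blob (or null) so
    // the handler can pass it to the base class Deserialize method.
    public byte[] Load(string upn, string resourceId) =>
        _rows.TryGetValue(Key(upn, resourceId), out var blob) ? blob : null;

    // Called from AfterAccess when HasStateChanged is true: persist
    // the bytes produced by Serialize; the handler then resets
    // HasStateChanged to false.
    public void Save(string upn, string resourceId, byte[] blob) =>
        _rows[Key(upn, resourceId)] = blob;

    // Called from the DeleteItem override after the base class has
    // removed the entry from the in-memory cache.
    public bool Delete(string upn, string resourceId) =>
        _rows.Remove(Key(upn, resourceId));
}
```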


That’s basically it.  All of the source code is included with this post so you can review it and use it as you see fit.

App Implementation

As I mentioned above, the trickier part in using ADAL with my custom token cache was to ensure that I was using ADAL “the right way” in my app.  Arguably this really isn’t even specific to my custom token cache; it’s the way that I should be using ADAL anywhere, regardless of what caching mechanism is being used for it.  As I was describing before, I’m using this with an ASP.NET app; that means that in order to get an access token I need to use the ADAL acquire token method that takes an authorization code and client credential (so I can pass in the app secret).  What’s also important in this process (according to the Vittorio blog I referenced at the beginning) is that you use the user’s actual login Authority when you create the AuthenticationContext, versus just using the common authority.  The last part of the flow is that when a user is ready to get an AuthenticationResult in subsequent requests, you should no longer ask for it using the authorization code – instead you should try one of the AcquireTokenSilent overloads.

So with all of this information in hand, here’s how the process works of getting an AuthenticationResult (which has the access token):

  • I have a table in Azure storage where I keep track of each user’s authentication parameters – UPN, resource ID, and login Authority.  When the user goes into a controller action that needs an access token I look in that table to see if I have a record matching the UPN and resource they are requesting.

  • The first time a user logs in, there won’t be any record so I redirect out to Azure to get an authorization code.

  • When Azure redirects to my controller, I see that there is an authorization code so I:
    • Create a new instance of AuthenticationContext, using my custom token cache:  authContext = new AuthenticationContext(“common authority here”, new AdalAzureCache.TokenStore(upn));

    • Request an AuthenticationResult:  result = authContext.AcquireTokenByAuthorizationCode(AuthCode, new Uri(Request.Url.GetLeftPart(UriPartial.Path)), credential);

    • Store the user’s UPN, the ID of the resource they were requesting, and the login Authority in my custom storage for this app.  To be clear – this is storage just for this app – it is different from the storage that my custom token cache class uses.  The way I get the Authority for the user is after I’ve acquired an AuthenticationResult for them, the AuthenticationContext.Authority property has their specific Authority.

  • The next time the user needs an AuthenticationResult, I look in storage for my app and I find a record with the user’s UPN, resource and Authority.  In that case I:

    • Create a new instance of the AuthenticationContext; same as before but this time I use the user’s Authority:  authContext = new AuthenticationContext(“user’s authority here”, new AdalAzureCache.TokenStore(upn));

    • Request an AuthenticationResult silently:  result = authContext.AcquireTokenSilent(ResourceUri, credential, UserIdentifier.AnyUser);  This particular overload is one that I just found through trial and error to work with my token cache.
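The flow above boils down to a lookup against the app’s own auth-parameters table.  Here’s a minimal sketch of that record and the lookup, with an in-memory dictionary standing in for the app’s Azure storage; the names are mine, and the actual ADAL calls are left as comments since they require the ADAL package and a live tenant:

```csharp
using System;
using System.Collections.Generic;

// Per-user record the app keeps in its own storage (distinct from
// the token cache storage): UPN, resource ID, and login Authority.
public class AuthParams
{
    public string Upn { get; set; }
    public string ResourceId { get; set; }
    public string Authority { get; set; }
}

public static class AuthFlow
{
    // A null result means there's no record yet, so redirect out to
    // Azure for an authorization code; otherwise the Authority is
    // known and AcquireTokenSilent can be used with the custom cache.
    public static AuthParams Lookup(
        IDictionary<string, AuthParams> store, string upn, string resourceId)
    {
        store.TryGetValue(upn + "|" + resourceId, out var rec);
        return rec;
    }

    // Saved right after AcquireTokenByAuthorizationCode succeeds,
    // taking the Authority value from AuthenticationContext.Authority.
    public static void Remember(
        IDictionary<string, AuthParams> store, AuthParams rec)
    {
        store[rec.Upn + "|" + rec.ResourceId] = rec;
    }
}
```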

That’s basically it.  When the request to acquire the token silently is called, I see my custom token cache getting invoked.  It finds the token cache in Azure table storage for the user and resource and then sends it out for ADAL to use.  Everything just plugs in and out like a well-oiled machine and works. 

All of the source code for the OneDriveBrowser application, updated to use the new App Implementation described above, is included with this post, along with the custom ADAL token cache that uses Azure table storage.  As with the previous OneDriveBrowser post, you will need to go into Globals.cs in the AdalAzureCache and CloudHelper projects and plug in your Azure storage name and key (exception:  if you don’t want to compile it into AdalAzureCache you can pass the values in through the constructor as explained previously).  In the web.config file for the OneDriveBrowser MVC application you will need to plug in your Azure application values.  Hope you find it useful.

You can download the file here: