Analyzing and Fixing Azure Web Sites with the SCM Virtual Directory

There are so many things you do every day as part of operating and maintaining your Azure web sites.  They’re a common target for developers because you get 10 free sites with your Azure subscription, and if you know what you’re doing you can spin that up into even more applications by using custom virtual directories, as I’ve previously explained here:  https://samlman.wordpress.com/2015/02/28/developing-and-deploying-multiple-sharepoint-2013-apps-to-a-single-azure-web-site/.  That example is specific to using them for SharePoint Apps, but you can follow the same process to use them for standard web apps as well.

Typically, you go through your publishing and management process using two out-of-the-box tools – Visual Studio and the Azure browser management pages.  What happens, though, when you need to go beyond the simple deploy and configure features of these tools?  Yes, there are third-party tools out there that can help with these scenarios, but many folks don’t realize that there’s also a LOT you can do with something that ships out of the box in Azure: the Kudu Services, or as I’ve called it above, the SCM virtual directory.

The SCM virtual directory is present in every Azure web site.  To access it, you merely insert “scm” between your web name and the host name.  For example, if you have an Azure web site at “contoso.azurewebsites.net”, then you would navigate to “contoso.scm.azurewebsites.net”.  Once you authenticate and get in, you’ll arrive at the home page for what they call the Kudu Services.  In this post I really just wanted to give you an overview of some of the features of the Kudu Services and how to find them, which I kind of just did.  🙂  At the end though I’ll include a link to more comprehensive documentation for Kudu.

Going back to my example, I found out about all of the tools and analysis available with the Kudu Services a few months ago when I was trying to publish an update to an Azure web site.  Try as I might, the deployment kept failing because it said a file in the deployment was being used by another process on the server.  Now of course, I don’t own the “server” in this case, because it’s an Azure server running the IIS service.  So that’s how I started down this path of “how am I gonna fix that” in Azure.  SCM came to the rescue.

To begin with, here’s a screenshot of the Kudu home page:

[Screenshot: the Kudu home page]

As you can see right off the bat, you get some basic information about the server and version on the home page.  The power of these features comes as you explore some of the other menu options available.  When you hop over to the Environment link, you get a list of the System Info, App Settings, Connection Strings, Environment variables, PATH info, HTTP headers, and the ever popular Server variables.  As a longtime ASP.NET developer I will happily admit that there have been many times when I’ve done a silly little enumeration of all of the Server variables when trying to debug some issue.  Now you can find them all ready to go for you, as shown in this screenshot:

[Screenshot: the Environment page, including the Server variables]
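If you’ve never written that silly little enumeration yourself, here’s a minimal sketch of the idea – a throwaway ASP.NET Web Forms code-behind (the page name is mine, not anything Kudu gives you) that dumps every server variable:

using System;
using System.Web.UI;

// Code-behind for a quick-and-dirty diagnostics page.
public partial class ServerVarsPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Write out every server variable, HTML-encoded so odd values don't break the page.
        foreach (string key in Request.ServerVariables.AllKeys)
        {
            Response.Write(Server.HtmlEncode(key + " = " + Request.ServerVariables[key]) + "<br/>");
        }
    }
}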

Now back to that pesky “file in use” problem I was describing above.  After trying every imaginable hack I could think of back then, I eventually used the “Debug console” in the Kudu Services.  These guys really did a nice job on this and offer both a Command prompt shell and a PowerShell prompt.  In my case, I popped open the Command prompt and quickly solved my issue.  Here’s an example:

[Screenshot: the Debug console Command prompt]

One of the things that’s cool about this as well is that as I motored around the directory structure with my old school DOS skills, i.e. “cd wwwroot”, the graphical display of the directory structure was kept in sync above the command prompt.  This really worked out magnificently; I had no idea how else I was going to get that issue fixed.
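You can script this kind of spelunking too.  The Kudu wiki documents a REST API, including a command endpoint, so something like this hedged C# sketch should work (the site name and deployment credentials are placeholders – double-check the endpoint details against the wiki before trusting my version):

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;

class KuduCommandDemo
{
    static void Main()
    {
        // Deployment (publishing) credentials for the site -- placeholders.
        string user = "$contoso";
        string password = "your-publishing-password";

        var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
            "Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes(user + ":" + password)));

        // Run the same kind of command I ran interactively, in site\wwwroot.
        var body = new StringContent(
            "{\"command\":\"dir\",\"dir\":\"site\\\\wwwroot\"}",
            Encoding.UTF8, "application/json");
        var response = client.PostAsync(
            "https://contoso.scm.azurewebsites.net/api/command", body).Result;
        Console.WriteLine(response.Content.ReadAsStringAsync().Result);
    }
}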

Beyond the tools I’ve shown already, there are several additional tools you will find, oddly enough, under the Tools menu.  Want to get the IIS logs?  No problem, grab the Diagnostic Dump.  You can also get a log stream, a dashboard of web jobs, a set of web hooks, the deployment script for your web site, and open a Support case.

Finally, you can also add Site Extensions to your web site.  There are actually a BUNCH of them that you can choose from.  Here’s the gallery from the Site Extensions menu:

[Screenshot: the Site Extensions gallery]

Of course, there are many more than fit on this single screenshot.  All of the additional functionality, and the ease with which you can access it, is pretty cool though.  Here’s an example of the Azure Websites Event Viewer.  You can launch it from the Installed items in your gallery and it pops open right in the browser:

[Screenshot: the Azure Websites Event Viewer]

So that’s a quick overview of the tools.  I used them some time ago, and then when I needed them a couple of months ago I couldn’t remember the virtual directory name.  I Bing’d my brains out unsuccessfully trying to find it, until it hit me when I looked at one of my site deployment scripts – they go to the SCM vdir as well.  Since I had such a hard time finding it, I thought I would capture it here, and hopefully your favorite search engine will find enough keywords in this post to help you track it down when you need it.

Finally, for a full set of details around what’s in the Kudu Services, check out their GitHub wiki page at https://github.com/projectkudu/kudu/wiki.


Azure B2B First Look

This is a companion piece to the Azure B2C First Look I published last week. This post is some first thoughts around the new Azure B2B features that were recently announced. The goal of Azure B2B is to reduce the complexity and rigidity of managing business-to-business relationships and sharing data. The announcement around the B2B feature correctly characterizes this problem as typically being solved either by federating between two companies, or by creating a directory (or OU or whatever) in your own internal directory for external users. Both solutions have pros and cons to them. The Azure B2B solution is what I would consider something of a hybrid approach between the two. External users continue to use their one and only external account for authentication, so they don’t have to remember multiple account names and passwords. However, they also get added (maybe “linked” is a better way to describe it) to the Azure AD tenant that invited them (a process I’ll explain below). Overall though, even in preview it’s a pretty painless and easy way to connect up people from different organizations.

As with the Azure B2C post, I’m not going to try and explain every bit of the feature and how to configure it – Microsoft has a team of writers tasked with just that. Instead, I’ll give you a few links to get started and then share some real-world experience with how it looks and works. So…to get started I would recommend taking a look at this content:

 

Getting Started

The process for getting started is a little peculiar, but there is a certain logic to it. In short, when you want to allow users from other companies to access your resources, you have to create a CSV file with information about them and import it into Azure AD, which you do via the portal. PowerShell support for doing this is not there at this time, but I’ll be a little surprised if it doesn’t show up by RTM. This is where the second link above will be useful to you – it explains the format of the CSV file you need to create and provides an example of what it looks like.

In my case, for my first test I started with my SamlMan.Com tenant and built out a simple ASP.NET MVC application secured with Azure AD. I got that all working and then invited my Office365Mon.Com account. The CSV for it looked something like this:

Email,DisplayName,InviteAppID,InviteReplyUrl,InviteAppResources,InviteGroupResources,InviteContactUsUrl

speschka@office365mon.com,Office365Mon Steve,FFEAF5BD-528B-4B53-8324-E4A94C1F0F06,,65b11ba2-50f7-4ed2-876d-6a29515904b2,,http://azure.microsoft.com/services/active-directory/

The main things worth pointing out in this simple CSV are the display name I gave – Office365Mon Steve – because that’s how it’s going to show up in the SamlMan.Com AD tenant, and the GUID that starts with 65b11…, which is the Client ID of the MVC application I described above. Once you import the CSV file, Azure sends an email to everyone in it, and then each person has to open the email, click a link and accept the invitation. You can see a report in Azure of the status of each person you invited.

Here’s what the report looks like:

When you get your invitation email it looks something like this (some additional customization options are available, but for testing I’m waaaaay too lazy to make it look pretty):

You’ll notice the email is addressed to the display name I put into the CSV file. I’ll show the invite pages in Azure a little later. Now, once the person has accepted the invitation, as I noted above, they show up in the Azure AD tenant from which the invitation was sent. Here’s what my SamlMan.Com directory looks like after the invite was accepted:

Now with the invitation accepted I can log into the ASP.NET application as speschka@office365mon.com and it works just perfectly. Here’s yet another screenshot of my favorite home page, with all of the current user’s claims enumerated on the page:

You can see again that display name of “Office365Mon Steve” getting carried throughout, so make sure you use something that won’t require constant tinkering.

 

Using it with Other Cloud Applications

One of the things I thought was nice was the integration with other cloud applications. Obviously the images above illustrate integrating with my own custom apps, but what about the apps people frequently use in the cloud, like Office 365? Well, it works pretty slick for that as well. To work through this scenario I flipped things around and used my Office365Mon tenant and invited my account at SamlMan.Com. In the SharePoint spirit of things…I started out by creating a new group in my Azure AD tenant, like this:

Next I went to my SharePoint Online site and added this Azure AD group to the SharePoint Viewers group:

Next I got the object ID of the Azure AD group by running this bit of PowerShell: Get-MsolGroup | fl DisplayName, ObjectId. With that in hand, I created a new CSV file to invite my SamlMan.Com user. The CSV file looked like this:

Email,DisplayName,InviteAppID,InviteReplyUrl,InviteAppResources,InviteGroupResources,InviteContactUsUrl

steve@samlman.com,SamlMan Steve,6787E4D5-0159-4B74-B56D-AA7C36715C0E,,,95419596-ff8a-4db1-c406-ed4ad13fd272,https://www.office365mon.com

The most interesting part of this CSV is the GUID that starts with 95419… It’s the object ID of the Azure AD group I added to the SharePoint Viewers group. Now when my invited user accepts the invitation, he or she will get added to that group. I get the standard email invite that I showed above. When I click on the link to accept it, I am treated to a page in Azure that looks like this (you can also customize the appearance of this, which I chose not to do):

After I went through the process to accept the invitation, I went back to the Office365Mon SharePoint Online team site and was able to successfully authenticate and get in with Viewer rights:

Wrap Up

Overall everything with the B2B feature went incredibly smoothly, with no hiccups or glitches of any kind; most of the Azure guys really seem to have their act together these days. 🙂 Honestly, it’s impressive, and I know a lot of teams at Microsoft that could really take a lesson from how engaged the Azure team is with the community, taking feedback, and delivering great products. I think the biggest challenge I’m imagining after my first look is just the long-term user management; for example, when an invited user’s account is deleted from its home Azure AD tenant, it would be nice to have that trigger deletes in all of the other AD tenants where it’s been invited. Stuff like that…but I think it will come in time. This is a very good start, however.

Azure B2C First Look

I’ve spent some time the last couple of days working with the new Azure B2C features and have had the chance to get some things working, as well as note a couple of things that you want to be aware of as you start playing with this yourself. The Azure B2C feature is the first really strong play towards obsoleting Microsoft ACS, or Access Control Services. Prior to B2C, the ACS service was the only Microsoft-hosted identity service that let you create identity provider relationships with social identity providers such as Facebook and Google. It’s an aging platform, however, and one that Microsoft has publicly said is on the way out. Since they came out with that statement, folks have been a bit between a rock and a hard place when wanting to use these social identity providers. You could write your own, use a paid identity hub service like Auth0, or spin “something” up on ACS, knowing full well that its end of life was coming. Even at that, the ACS connectivity with Google for creating new identity provider relationships broke last year, so things were not looking good.

The Azure B2C feature now slides into public preview up on Azure. It provides hooks today to let you authenticate using Facebook, Google, Amazon and LinkedIn, as well as any local accounts you have in your new B2C Azure Active Directory. Support for Windows Live accounts, ironically, is not there yet, but the team has said that it is coming. Along those lines, here are a couple of links that you should check out to really get started and dig your teeth into this:

While there are some good links to get you started, I wanted to focus on some of the early rough edges so you can hopefully get up and running just a little bit quicker. I’ll preface this by saying the Azure team really has their stuff together pretty well. There are videos, sample apps, blog posts, and some good documentation. They also say that they are supporting it via forums on StackOverflow.Com, although so far my one post up there has been getting just a bit lonely. We’ll see. In the meantime, let’s move on to some things to keep in mind.

Sample Code Problems

You may work through the sample applications to build out your own site, which is the logical way to test it. In my case I built a standard VS 2015 ASP.NET MVC application, so I followed along with that sample. I actually only had to copy one set of files over from the GitHub sample that they use – a folder called PolicyAuthHandlers and the set of classes it contains. When I thought I had it all built out, I ran it and quite quickly got the dreaded yellow screen of death (i.e. an unhandled exception). I spent some time poking through this and realized that there is a problem in the Startup.Auth.cs file, in the OnRedirectToIdentityProvider method. Specifically, this line of code fires when you first authenticate, as the app tries to figure out which Azure B2C policy to use (you can read more about policies in the links above):

OpenIdConnectConfiguration config = await gr.GetConfigurationByPolicyAsync(CancellationToken.None, notification.OwinContext.Authentication.AuthenticationResponseChallenge.Properties.Dictionary[Startup.PolicyKey]);

The problem is that the notification.OwinContext.Authentication.AuthenticationResponseChallenge property is null, so the whole thing blows up. For now, I’m working around it by replacing this parameter (which is just the name of the policy to use) with this other static string you’ll find in the sample code: SignUpPolicyId.
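Here’s roughly what that workaround looks like in my copy of Startup.Auth.cs – a sketch only, since the variable names (notification, gr, Startup.PolicyKey, SignUpPolicyId) all come from the sample app and may shift as it’s updated:

// Inside OnRedirectToIdentityProvider: fall back to the sign-up policy
// when AuthenticationResponseChallenge comes back null.
var challenge = notification.OwinContext.Authentication.AuthenticationResponseChallenge;

string policy = (challenge != null && challenge.Properties.Dictionary.ContainsKey(Startup.PolicyKey))
    ? challenge.Properties.Dictionary[Startup.PolicyKey]
    : Startup.SignUpPolicyId;  // hard-coded fallback for now

OpenIdConnectConfiguration config = await gr.GetConfigurationByPolicyAsync(CancellationToken.None, policy);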

You Get a 404 Response When You Try and Authenticate

After you fix the sample code problem, then you will likely find that when your browser gets redirected to authenticate, it throws a 404 error (or more precisely, Azure throws the exception). After much digging around on that one, I finally realized that they had created a bad query string for me to login with. Specifically, it contained two question marks. In web land, that means it ignores all parameters after that second question mark. The second question mark needs to be an ampersand, and then everything moves right along. Here’s a screenshot of what the URL looks like:

I’ve highlighted the two question marks to make them easier to spot. The way to fix it? So far, each time I authenticate, I wait for the 404 error and then I manually go in and change it in the browser address bar. So I just change the second question mark to an ampersand, hit enter, and away I go.

UPDATE 9/23/15:  Clarky888 was good enough to point out on StackOverflow.Com that this problem is fixed if you update your reference for Microsoft.IdentityModel.Protocol.Extensions to v1.0.2.206221351.  I did this and found that everything works quite well now.

You Get a Login Error That the User Already Exists

One of the problems with my “fix” for the first problem I described in this post is that I set the policy that should be used to one that is only supposed to be fired when you are creating a new account. By “creating a new account”, I mean you are logging into the site for the first time. You can log in with whatever social identity provider you want, and then Azure B2C creates what for now I’ll call a “shadow account” in the Azure B2C AD tenant. For example, after I logged in with all of my different social and local accounts, here’s what my list of users looks like in my Azure B2C AD tenant:

So in the fix to the first problem, I described changing the second question mark to an ampersand. Well, the first query string parameter is the name of your B2C policy that’s going to be used. So in my case, I’m hard coding it in my Startup.Auth.cs file to use my sign “up” policy. However, I also have a different policy for accounts that already exist, called my sign “in” policy. In the address bar picture above you can see the name of my sign up policy is b2c_1_firstdemosignup. So for a user that has already signed up, when I get that 404 error I change the name of the policy as well as the second question mark. I just type over the last two characters so my policy name is b2c_1_firstdemosignin and then change the next character from a question mark to an ampersand, hit enter, and we’re working again. This brings up the “money” shot so to speak – here you can see all of my identity providers alive on my Azure B2C sign in page:

For this post I decided to log in with Amazon…mostly because I’ve never used Amazon as an identity provider before so I thought I would breathe deeply on all the oauth2 goodness. Here’s the login dialog from Amazon after I select it as my identity provider:

Once I sign in then I have a little code that just enumerates the set of claims I have and spits them all out on the page for me. Of course it’s big and ugly, which I think is exactly the kind of work you all have come to expect from me.  🙂
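The code really is nothing special – here’s a minimal sketch of that claims dump as an MVC controller (the controller name is made up):

using System.Security.Claims;
using System.Text;
using System.Web.Mvc;

// Dumps every claim for the current user as plain text.
public class ClaimsController : Controller
{
    [Authorize]
    public ContentResult Index()
    {
        var sb = new StringBuilder();
        foreach (Claim claim in ClaimsPrincipal.Current.Claims)
        {
            sb.AppendLine(claim.Type + " : " + claim.Value);
        }
        return Content(sb.ToString(), "text/plain");
    }
}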

There you go – there’s your end to end login experience with Azure B2C. Pretty cool stuff.

Other Miscellaneous Issues

There are a couple of other little “things” that I noticed that might be worth remembering as well; here they are:

  • It doesn’t recognize the username / password of my account that’s a global admin on my Azure AD B2C tenant.  This is a bit of a weird one, but I think I know why.  To back up a little, as I mentioned earlier, you have to create a new Azure AD tenant and, when you do, configure it as a B2C tenant.  So you log into the Azure management portal and go through that exercise.  When you do, it adds the account you are currently logged in with as the global admin of the new Azure AD B2C tenant.  However, it doesn’t actually create an account (and corresponding password) for it in that tenant – it just notes that it’s from a different Azure AD tenant.  That’s why I think it won’t let me log in with that account.  Kind of sorta makes sense in a certain Microsoft “by design” kind of way.
  • You can add custom claim types in your B2C tenant.  So for example, I added “favoriteTeam” and “favoriteSport”.  However, the claim type that Azure sends to you precedes all of your custom attributes with “extension_”, so “favoriteTeam” becomes “extension_favoriteTeam”.  I mostly mention this so you are aware of it when making authorization decisions based on custom claim values (there’s a short sketch of that after this list).  I don’t know if this will change for RTM or not, but I guess we’ll find out together.
  • It doesn’t seem to pass along the access tokens (if provided) from these social identity providers.  For example, with ACS if you logged in via Facebook, it also passed along an access token from Facebook in your claims collection.  That was really sweet because you could then extract that access token and use it to make calls to the Facebook Graph API.  That functionality is not in there today; not sure if it will be when it’s released.  It was a very nice feature, so hopefully it will make a comeback.
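As promised, here’s what an authorization check against one of those custom claims might look like – a minimal sketch, remembering that favoriteTeam is just my made-up attribute:

using System.Security.Claims;

public static class CustomClaimChecks
{
    // Note the "extension_" prefix Azure adds to custom B2C attributes.
    public static bool IsFavoriteTeam(ClaimsPrincipal user, string team)
    {
        Claim claim = user.FindFirst("extension_favoriteTeam");
        return claim != null && claim.Value == team;
    }
}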

That’s it – there’s my first look at Azure B2C. I’ll be working with it more and adding some other posts about it in the future. As always, you can reach out to me at steve@samlman.com with your own observations, comments and questions…or just post ‘em here so everyone can see them.  😉

Do You Need An Account In Azure Active Directory if Using ADFS?

Today’s topic is a little spin on a question that seems to be coming up more frequently, specifically when folks are using a combination of Azure Active Directory and ADFS. That question is, if I’m using ADFS do I really need to have an account in an Azure Active Directory (AAD) tenant? Well, of course, the answer is it depends.

If all you are going to use AAD for is as some kind of application registry, but all of your applications are hosted elsewhere, you really don’t need an account in AAD. You can just add your domain to the list of domains in your Azure subscription, set up the domain as a federated domain, and configure the ADFS endpoint where folks should get redirected in order to authenticate.

Where this scenario breaks is if you are securing your applications with AAD. For example, in your AAD tenant you go to the Applications tab, add a new application, give it some configuration information, and add one or more permissions that your application is going to need. In this case, AAD is being used to secure access to the application, and so at a minimum you have to ask for the right to log in and read the profile information – this is needed for AAD to be able to send your set of claims off to your application after you authenticate. In that case, there is going to be a consent process that users go through. The first time they log into the application, Azure will present a page to them that describes all of the permissions the application is asking for, and the user has to consent to allow the application to have access in order to continue. This is “the consent process” (as simply explained by Steve).

In this case, if there is not a corresponding user object in AAD, the user ends up getting an error page after authenticating into AAD. The error message, which is in the fine print near the bottom of the page, will say something like hey, this external user has to be added to the directory. What this really means is hey, I need to be able to track whether this user consents to letting this app get at his information, and if I can’t track his consent then we’re going to have to stop here. The way to fix this issue is just to set up directory synchronization. That will populate AAD with all of the users and groups in the on-premises AD, even though users will not be using AAD to authenticate. Once you do that, your error will go away and the whole end-to-end process will work for you.

So the net – if you are trying to use AAD to secure your application – you need users in AAD too. If not, you don’t need to populate the directory. In most cases people will end up wanting to secure apps in AAD so DirSync will be needed. If you end up going that way, the Azure AD Connect tool is used to do the synchronization for you: https://msdn.microsoft.com/en-us/library/azure/dn832695.aspx.

Fun with Azure Key Vault Services

I was able to spend a little time recently with a new Azure service, the Key Vault service, for some work I was doing. It’s a pretty valuable and not too difficult service that solves an age-old problem: where can I securely keep secrets for my applications in Windows Azure? Actually, because of the way it’s implemented you really don’t even need to have your application hosted in Azure…but I’m getting a little ahead of myself. Let’s start with the basics. As a precursor to what I have here, I’ll just point out that there’s actually some pretty good documentation on this service available at http://azure.microsoft.com/en-us/services/key-vault/.

Getting Started

Before you start trying to build anything, you really need to have the latest version of the Azure PowerShell cmdlets, as well as the new cmdlets they’ve built for working with Key Vault. You can get the very latest of the Azure PowerShell cmdlets by going here: https://github.com/Azure/azure-powershell/releases. You can get the Key Vault cmdlets by going here: https://gallery.technet.microsoft.com/scriptcenter/Azure-Key-Vault-Powershell-1349b091.

Create a New Vault and Secret(s)

The next step is to crack open your Azure PowerShell window and load up the Key Vault cmdlets. You can do that like this:

Set-ExecutionPolicy Bypass -Scope Process

import-module C:\DirectoryYouExtractedKeyVaultCmdletsTo\KeyVaultManager

I’m just turning off the policy that only allows signed scripts to run with the first line of code (and just in this process), and then loading up the cmdlets with the next line of code. After that you need to connect to your Azure AD tenant like this:

add-azureaccount

If you have multiple subscriptions and you want to target the specific subscription where you want to create your Key Vault and secrets and keys, then you can do this:

Set-AzureSubscription -SubscriptionId some-guid-here

You’ll see a list of guids for your subscription after you log in with the add-azureaccount cmdlet. Now that you’re logged in and set in your subscription, you can do the first step, which is to create a new vault. The PowerShell for it is pretty easy – just this one line of code:

New-AzureKeyVault -VaultName "SteveDemo" -ResourceGroupName "SteveResources" -Location "West US"

There are a few things worth noting here:

  • The VaultName must be unique amongst ALL vaults in Azure.  It’s just like an Azure storage account in that sense.  The name will become part of the unique Url you use to address it later.
  • The ResourceGroupName can be whatever you want.  If it doesn’t exist, the cmdlets will create it.
  • The locations are limited right now.  In the US you can create a vault in the West and East but not Central.  Azure should have some documentation somewhere on which locations you can use (i.e. I’m sure they do, I just haven’t gone looking for it yet).

Okay cool – we got a vault created, now we can create keys and secrets. In this case I’m going to use the scenario where I need some kind of database connection string. Assume as well that like in most cases, I have a separate database for dev and production. So what I’m going to do is just create two secrets, one for each. Here’s what that looks like:

$conStrKey = ConvertTo-SecureString 'myDevConStr' -AsPlainText -Force

$devSecret = Set-AzureKeyVaultSecret -VaultName 'SteveDemo' -Name 'DevConStr' -SecretValue $conStrKey

$conStrKey = ConvertTo-SecureString 'myProdConStr' -AsPlainText -Force

$prodSecret = Set-AzureKeyVaultSecret -VaultName 'SteveDemo' -Name 'ProdConStr' -SecretValue $conStrKey

Awesome, we’re ready to go, right? Mmm, almost, but not quite. Like all things Azure, you need to configure an application that you’ll use to access these secrets. But it’s different (of course) than other apps, in that you don’t pick it from a list of services and select the features you want to use. That may come later, but it’s not here yet.

Grant Your Application Rights to the Secrets

There are a few steps here to get your app rights to the secrets you created.

  1. Create a new application in Azure AD.  I won’t go through all the steps here because they’re documented all over the place.  But the net is you create a new app, say it’s one you are developing, and type in whatever values you want for the sign-on URL and App ID URI.  After that you need to go copy the client ID and save it somewhere, then create a new key and save it somewhere.  You’ll use these later.
  2. Go back to your PowerShell window and grant rights to read keys and/or secrets to your application.  You can do that with a line of PowerShell that looks like this:

Set-AzureKeyVaultAccessPolicy -VaultName 'SteveDemo' -ServicePrincipalName theClientIdOfYourAzureApp -PermissionsToKeys all -PermissionsToSecrets all

In this case I kind of cheated and took the easy way out by granting all possible permissions to my app. If you just wanted it to be able to read secrets, then you could configure it that way. One other thing worth noting – one of the most common errors I have seen from folks using this service is that they only remember to grant PermissionsToKeys or PermissionsToSecrets, but not both. If you do that then you will get these kinds of weird errors that say something like “operation ‘foo’ is not allowed”. Well, yeah, technically that’s correct. What’s really happening is that you’re forbidden from doing something that you have not expressly granted your application rights to do. So be on the lookout for that.

Use Your Secrets

Okay cool, now that we’ve got our app all set up we can start using these secrets, right? Yes! 🙂  The first thing I would recommend doing is downloading the sample application that shows off performing all of the major Key Vault functions. The latest sample is at http://www.microsoft.com/en-us/download/details.aspx?id=45343. I just know they will change the location any day now and my blog will be out of date, but hey, that’s what it is today. Now, there is supposed to be a NuGet package that would allow you to use this more easily from .NET managed code, but I currently cannot find it. If you grab the code download, though, you will see a sample project that you can easily compile for nice assembly-based access to the vault.

Going back to something I mentioned at the beginning though – about not “even need(ing) to have your application hosted in Azure” – one of the really great things about this service is that it’s all accessible via REST. So…that gives you LOTS of options, both for working with the content in the vault, as well as the tools, platforms, and hosting options for your applications that use it. For me, I feel that the most common use case by a wide, wide margin is an application that needs to read a secret – like a connection string. So what I did was go through the sample code and reverse engineer out a few things. The result was a much smaller, simpler piece that I use just for retrieving a secret. The pseudo-code goes something like this:

  1. Get an Azure AD access token
    1. Use the client ID and client secret you obtained earlier, and combine that with the Azure AD resource ID for the Key Vault service – which is https://vault.azure.net.
    2. Get the access token by creating a ClientCredential (with the client ID and client secret), and using the resource ID.  The code looks something like this:

var clientCredential = new ClientCredential(client_id, client_secret);
var context = new AuthenticationContext(Authority + authorityContext, null);
var result = context.AcquireToken("https://vault.azure.net", clientCredential);

There’s one other REALLY important thing to note here. For those of you familiar with Azure AD, you may be used to getting your AuthenticationResults using the Common Login authority. In this case, you will get an AuthenticationResult, but you will get a 403 Forbidden when trying to use it. Mucho painful debugging to figure this out. You MUST use the tenant ID of the Azure AD instance where your application is registered. The tenant ID is just another GUID you can get out of the Azure portal. So for instance, in the code above where it’s getting the AuthenticationContext, those two variables look like this:

const string Authority = "https://login.windows.net/";
string authorityContext = "cc64c719-d217-4c44-dd24-42c18f9cb9f2";

  2. Create a new HttpClient and plug the access token from step 1 into an authorization header.  That code looks something like this:

HttpClient hc = new HttpClient();

hc.DefaultRequestHeaders.Authorization = new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", accessToken);

  3. Configure the request headers to indicate that you accept application/json, like this:

hc.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

  4. Create the Url you are going to use to access your secret.  Here’s my example:

string targetUri = "https://stevedemo.vault.azure.net/secrets/prodconstr?api-version=2014-12-08-preview";

A couple of key things to remember about this Url:

  • “stevedemo” is the name of my vault.  This is why it has to be unique in all of Azure.
  • “prodconstr” is the name of the secret I created.  There is also an ability to add a GUID after that which corresponds to a specific version of a secret if needed.  I won’t cover it here, but it’s in the documentation if you need it.
  5. Request your secret.  My code looks like this:

// NOTE: Using await here without ConfigureAwait causes the thing to disappear into the ether
HttpResponseMessage hrm = await hc.GetAsync(new Uri(targetUri)).ConfigureAwait(false);

  6. Get the results.  My code:

if (hrm.IsSuccessStatusCode)
{
    Secret data = await DeserializeAsync<Secret>(hrm);
    result = data.Value;
}

In the spirit of being honest… 🙂 …I borrowed the code to deserialize the results into a class. It’s extremely short and the class is quite simple, so it only takes about 30 seconds of copy and paste.
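In case you don’t feel like digging it out yourself, here’s roughly what that borrowed code boils down to – a hedged sketch, assuming the secret comes back as JSON with “value” and “id” properties:

using System.Net.Http;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Json;
using System.Threading.Tasks;

// Just enough shape to pull the secret out of the JSON response.
[DataContract]
public class Secret
{
    [DataMember(Name = "value")]
    public string Value { get; set; }

    [DataMember(Name = "id")]
    public string Id { get; set; }
}

// The deserialize helper; it lives in the same class as the calling code above.
private static async Task<T> DeserializeAsync<T>(HttpResponseMessage response)
{
    using (var stream = await response.Content.ReadAsStreamAsync())
    {
        var serializer = new DataContractJsonSerializer(typeof(T));
        return (T)serializer.ReadObject(stream);
    }
}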

Using the Secret

So there you have it – a pretty quick and easy way to get at your secrets. Using it within my code then is really simple. I have one line of code to retrieve the secret using the code shown above:

conStr = VaultSecret.GetSecret(KEY_VAULT_NAME, KEY_VAULT_SECRET_NAME).Result;

Where VaultSecret is the name of my class where I put the code I showed above, GetSecret is the method I created that contains that code, and, well, hopefully the two parameters are self-explanatory. Overall – good stuff, and all things considered, relatively quick to get up and going with.
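And to pull the whole thing together, here’s a hedged sketch of what a VaultSecret class like mine might look like end to end – the tenant ID, client ID and client secret values are placeholders, and it leans on the Secret class and DeserializeAsync helper shown earlier:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;
using Microsoft.IdentityModel.Clients.ActiveDirectory;

public static class VaultSecret
{
    const string Authority = "https://login.windows.net/";
    const string TenantId = "your-AAD-tenant-guid";   // NOT the common endpoint!
    const string ClientId = "your-client-id";
    const string ClientSecret = "your-client-secret";

    public static async Task<string> GetSecret(string vaultName, string secretName)
    {
        // Step 1: get an access token for the Key Vault resource.
        var clientCredential = new ClientCredential(ClientId, ClientSecret);
        var context = new AuthenticationContext(Authority + TenantId, null);
        var token = context.AcquireToken("https://vault.azure.net", clientCredential);

        // Steps 2 and 3: bearer token plus JSON accept header.
        var hc = new HttpClient();
        hc.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", token.AccessToken);
        hc.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

        // Steps 4 and 5: build the secret Url and request it.
        string targetUri = "https://" + vaultName + ".vault.azure.net/secrets/" + secretName + "?api-version=2014-12-08-preview";
        HttpResponseMessage hrm = await hc.GetAsync(new Uri(targetUri)).ConfigureAwait(false);

        // Step 6: pull the value out of the response.
        if (!hrm.IsSuccessStatusCode)
            return null;
        Secret data = await DeserializeAsync<Secret>(hrm);
        return data.Value;
    }
}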

Debugging SharePoint Apps That Are Hosted In Windows Azure Web Sites

Today, I’m going to be the lazy human I’m so frequently accused of being by my somewhat faithful dog Shasta, and bring together two posts written by two other folks into one uber “ain’t it cool how this all works together post” by me.  Here are the two concepts we’re combining today:

Now, once our SharePoint App has been published to a Windows Azure web site, the error-prone and/or forward-thinking amongst you may be wondering…um, great…so what do I do to track down bugs?  Well, that’s where the second piece of brilliant advice that I had nothing to do with comes in.

Now, let’s briefly walk through the steps to combine these two nuggets of goodness:

  1. Create a SharePoint provider hosted app and verify that it works.
  2. Create an Azure web site and download publishing profile. (in Vesa’s video)
  3. Use appregnew.aspx to get a client ID and client secret. (in Vesa’s video)
  4. Publish the App to your Windows Azure site using the publishing profile, client ID and client secret retrieved in the previous steps. (in Vesa’s video)
  5. Create the app package, install it to your app catalog, and add it to your site. (in Vesa’s video)
  6. Open Server Explorer in Visual Studio 2013, right-click on the Windows Azure node and select Connect to Windows Azure…
  7. Expand to see all the Azure services, and then expand the collection of Web Sites.
  8. Right-click on the Azure web site where you published your provider-hosted app and select Attach Debugger.
  9. The browser opens to your Azure web site, and VS.NET starts up in debugging mode.  Set your breakpoints in your code and start debugging!

See the remotely debugging Azure web sites post for the details on prerequisites, but in short you need Visual Studio 2013 and the Azure 2.2 SDK for VS 2013; you will find a link to that in the blog post. (NOTE:  that same post also describes how to do this with Visual Studio 2012, but I have not tried that)  This actually works pretty great, and I got first-hand experience using it when I went through the steps for this blog post.  As it turns out, the SharePoint site where I installed my sample application uses the Url https://sps2.  Well, the problem of course is that in my Azure web site, my code was trying to make a CSOM call to an endpoint at “sps2”.  That works great when I’m in my lab environment, but out in the interwebs where Azure lives it of course cannot resolve a simple NetBIOS name (remember, this code is running server side, not client side).  So as a result it was blowing up.  By using this cool new debugging feature I was able to find my issue – appropriately enough for this debugging post.  Here’s a screenshot of it in action:

 

Creating and Using a Certificate for the CSUpload Tool with Azure IaaS Services

In my posting on using SharePoint up in Azure IaaS services (http://blogs.technet.com/b/speschka/archive/2012/06/17/creating-an-azure-persistent-vm-for-an-isolated-sharepoint-farm.aspx), one of my friends – Mike Taghizadeh, who demands that he be mentioned  🙂  – noticed that I didn’t have instructions for how to create a certificate and use it with the csupload command line tool.  So to help those who may be having the same issue, I’m going to make a quick run through that process here.

To begin with, really the easiest way to create a certificate that you can use for this purpose is to open IIS Manager on Windows 7 or Windows Server 2008 or later and create a self-signed certificate in there.  You can find it in IIS Manager by clicking on the server name, then double-clicking on Server Certificates in the middle pane.  That shows all of the installed certificates, and in the right task pane you will see an option to Create Self-Signed Certificate…

After you create your certificate you need to export it twice – once with the private key, and once without.  The reason you do it twice is that you need to upload the certificate without the private key to Azure, while the certificate with the private key needs to be added to your personal certificate store on the computer where you are making your connection to Azure with csupload.  When you create the certificate in the IIS Manager it puts the certificate in the machine’s personal store, which is why you need to export it and add it to your own personal store.

Exporting the certificate is fairly straightforward – just click on it in the IIS Manager, then click on the Details tab of the certificate properties and click the Copy to File… button.  I’m confident you can use the wizard to figure out how to export it with and without the private key.  Once you have the export with the private key (the .pfx file), open the Certificates MMC snap-in and import it into the Personal store for your user account.  For the export without the private key, just navigate to the Azure portal and upload it there.  You want to click on the Hosted Services, Storage Accounts and CDN link in the bottom left navigation, and then click on the Management Certificates in the top left navigation.  Note that if you don’t see these navigation options you are probably viewing the new preview Azure management portal, and you need to switch back to the current Azure management portal.  You can do that by hovering over the green PREVIEW button in the top center of the page, then clicking the link to “Take me to the previous portal”.

When you’re in the Management Certificates section you can upload the certificate you exported (the .cer file).  What’s nice about doing it this way is that you can also copy the subscription ID and certificate thumbprint right out of the portal.  You’ll need both of these when you create the connection string for csupload.  If you click on the subscription, or click on the certificate, you’ll see these values in the right info pane in the Azure Management Portal.  Once you’ve copied the values out, you can plug them into a connection string for csupload like this:

csupload Set-Connection "SubscriptionID=yourSubscriptionID;CertificateThumbprint=yourThumbprint;ServiceManagementEndpoint=https://management.core.windows.net"

Once you do that you are good to go and can start using csupload.  If you get an error message that says:  “Cannot access the certificate specified in the connection string. Verify that the certificate is installed and accessible.  Cannot find a connection string.” – that means the certificate cannot be found in your user’s Personal certificate store.  Make sure that you have imported the .pfx file into it and try again.