The Azure Custom Claim Provider for SharePoint Project Part 1

Hi all, it’s been a while since I’ve added new content about SAML claims, so I decided to come back around and write some more about it in a way that links together some of my favorite topics – SharePoint, SAML, custom claims providers, the CASI Kit and Azure.  This is the first part in a series in which I will deliver a proof of concept, complete with source code that you can freely use as you wish, that demonstrates building a custom claims provider for SharePoint that uses Windows Azure as the data source.  At a high level the implementation will look something like this:

  • Users will log into the site using SAML federation with ACS.  On the ACS side I’ll configure a few different identity providers – probably Google, Yahoo and Facebook.  So users will sign in using their Google email address for example, and then once authenticated will be redirected into the site.
  • I’ll use Azure queues to route claim information about users and populate Azure table storage
  • I’ll have a WCF application that I use to front-end requests for data in Azure table storage, as well as to drop off new items in the queue.  We’ll create a trust between the SharePoint site and this WCF application to control who gets in and what they can see and do.
  • On the SharePoint side, I’ll create a custom claims provider.  It will get the list of claim types I support, as well as do the people picker searching and name resolution.  Under the covers it will use the CASI Kit to communicate with Windows Azure.

When we’re done we’ll have a fully end to end SharePoint-to-Cloud integrated environment.  Hope you enjoy the results.

In Part 2, I walked through all of the components that run in the cloud – the data classes that are used to work with Azure table storage and queues, a worker role to read items out of queues and populate table storage, and a WCF front end that lets a client application create new items in the queue as well as do all the standard SharePoint people picker stuff – provide a list of supported claim types, search for claim values and resolve claims.

In Part 3 I create all of the components used in the SharePoint farm.  That includes a custom component based on the CASI Kit that manages all the communication between SharePoint and Azure.  There is a custom web part that captures information about new users and pushes it into an Azure queue.  Finally, there is a custom claims provider that communicates with Azure table storage through a WCF service – via the CASI Kit custom component – to enable the type-in control and people picker functionality.

Adding Users Programmatically to A Claims Site in SharePoint 2010

I had a friend send me kind of an interesting problem the other day.  He was trying to add a new user programmatically to a Windows claims site and having all sorts of difficulties.  His initial attempt at adding the user with domain\username and the SPRoleAssignment class was not working for him.  He then tried various flavors of providing the claims encoded value for what the user name should be and got that to work partially, but had some strange side effects like the name appearing twice.  While I didn’t have a chance to personally examine all the variations and issues he had, what I did find worked for me on the first try was just to use the EnsureUser method on SPWeb.  It’s much simpler than having to fool around to try and get the encoding for an account name; it’s also a lot easier because you only need to pass in the account name instead of the four parameters you normally have to use to add a user.  EnsureUser automatically takes care of encoding the name and really simplified his code. 

For completeness here’s a short example:

using (SPSite theSite = new SPSite("http://foo"))
  using (SPWeb theWeb = theSite.OpenWeb())
  {
    //the verbatim string avoids an escaping problem with the backslash
    SPUser theUser = theWeb.EnsureUser(@"domain\username");
  }

UPDATE:  Make sure you read this post on how you should format the user name:

Facebook DataView WebPart Code from SharePoint Conference

For those of you who attended my SPC 351 session at SharePoint Conference today (Hitting the Ground Running with Claims Authentication), there was a request for some source code.   As promised, I’m attaching a zip file with the source to the Facebook DataView WebPart I demonstrated.  It’s based on using ACS to log into a SharePoint site with Facebook authentication, and then using the Facebook Access Token (that comes in a special SAML claim) to make a request out to Facebook for the user’s public profile info.  The source code is attached for your use and enjoyment.  Hopefully the recorded SPC sessions will be posted somewhere; if not and there is sufficient demand (by my random estimate of your interest based on comments to this post) I may try and find a place to post a separate recording I did of the demo…my backup in case of network connectivity issues.


You can download the attachment here:

The CASI Kit Announcement from SharePoint Conference

Just wanted to update folks on the announcement made at the SharePoint Conference yesterday regarding the CASI Kit.  I have decided to release everything for it – full source code to the base class, the web part, and all of the sample projects – on CodePlex.  If you go to the CodePlex project now you can get everything that makes up this toolkit.  In addition to that you will find:

  • A video that walks you through the process of building an application with the CASI Kit
  • The sample project that I built out at SharePoint Conference – both the starting project and completed project – along with written step by step instructions for building it yourself.  The CASI Kit is simple enough to use that the instructions for building the application fit on a single page!
  • All of the written guidance for using the CASI Kit

The reasons for doing this primarily came down to these:

  1. By having the source code available, if you have any issues or find bugs you can put the code in the debugger, step through it, and make changes as needed.  So you should have full comfort that you aren’t relying on an unsupported black-box component; now you can see everything that’s going on and how it’s doing it.
  2. As features in the SharePoint product change over time, having the source code allows you to modify it and change it to stay in step with those changes.  For example, if new ways are added to connect up SharePoint and other cloud services then you can modify the code to take advantage of those new platform features, or even transition off the CASI Kit in a prescriptive manner.  With the source code, you’re in control of adapting to those changes in the future.
  3. You now have the opportunity to build other solutions, whatever you want, using the CASI Kit as is or breaking it apart and using it as a really big building block for your own custom applications.

Hopefully you will find this source code and kit useful for connecting to Windows Azure and other cloud-based services going forward.  Enjoy!

Retrieving REST Data Using NTLM From a Dual Auth Site in SharePoint 2010

The title of this post actually makes this sound a lot more complicated than the final solution.  It’s really a case of combining the techniques I discussed in two previous posts.  The short version of the scenario is this – some folks wanted to do something like a health check ping against a SharePoint site that used SAML authentication.  Previously they had only been working against sites that used Windows authentication, and as soon as they tried those tools against a site that supported multiple authentication types – SAML and Windows – those tools stopped working.

The point of the health check is just to make a request to a site and make sure that data is returned; if some error code is returned instead then they can start digging into it.  I decided the easiest way to do this was just to make a call to listdata.svc, the REST endpoint for the site.  It is something that will always be there, and configuring the request to force NTLM in a multi-auth site turned out to be pretty easy, as I figured it would be.  The gist of the approach is just to make an HttpWebRequest and add the header I described previously to force it to use NTLM.  The result is a fairly straightforward chunk of code that looks like this:

string endpoint = UrlTxt.Text + "/_vti_bin/listdata.svc";

//make a request to the REST interface for the data
HttpWebRequest webRqst = (HttpWebRequest)WebRequest.Create(endpoint);
webRqst.UseDefaultCredentials = true;
webRqst.Method = "GET";
webRqst.Accept = "*/*";
webRqst.KeepAlive = true;

//this header tells SharePoint to use Windows auth
webRqst.Headers.Add("X-FORMS_BASED_AUTH_ACCEPTED", "f");

//make the request and get the response
HttpWebResponse webResp = webRqst.GetResponse() as HttpWebResponse;

//read the response stream
StreamReader theData = new StreamReader(webResp.GetResponseStream(), true);
string payload = theData.ReadToEnd();

ResultsTxt.Text = payload;

So as you can see, I just create the request, set a few properties and then add my header that tells SharePoint to use Windows auth.  Then I just make my request and I’m good to go.  It’s a pretty simple project, but I’ve attached the complete solution to this posting in case it’s helpful.


You can download the attachment here:

Make Sure You Know This About SharePoint 2010 Claims Authentication – Sticky Sessions Are REQUIRED

Hey folks, I’m here to tell you that I too now have my own story of getting burned by an anomaly of using claims authentication that I wish had been made clearer to me.  This is such a fundamental aspect of deploying it that I want to make sure I call it out front and center here so that the same thing doesn’t happen to you.

Very simply stated, if you’re using claims authentication, you MUST use affinity in your load balancing solution.  TechNet does describe this, but only as a very brief side note, and not in an appropriately convincing fashion.  The article says this:

Note: If you use SAML token-based authentication with AD FS on a SharePoint Foundation 2010 farm that has multiple Web servers in a load-balanced configuration, there might be an effect on the performance and functionality of client Web-page views. When AD FS provides the authentication token to the client, that token is submitted to SharePoint Foundation 2010 for each permission-restricted page element. If the load-balanced solution is not using affinity, each secured element is authenticated to more than one SharePoint Foundation 2010 server, which might result in rejection of the token. After the token is rejected, SharePoint Foundation 2010 redirects the client to authenticate again back to the AD FS server. After this occurs, an AD FS server might reject multiple requests that are made in a short time period. This behavior is by design, to protect against a denial of service attack. If performance is adversely affected or pages do not load completely, consider setting network load balancing to single affinity. This isolates the requests for SAML tokens to a single Web server.

I’ll take the hit for not noticing this and not taking it more seriously, but I’m blogging about it now so hopefully you won’t have to.  I’ve italicized the words in the note that clearly do not do this justice (nor should it be a note for that matter – it should be in big bold letters).  If you don’t use affinity you will see some of these kinds of issues occur:

  • You may randomly be redirected back to a login page.
  • You may end up in an authentication loop that causes ADFS to halt the request because of a perceived denial of service (DOS) attack, as the note states.
  • If you look at a trace of the activity, you may see SharePoint setting your fedauth cookie to an expired value and then making requests to ADFS again; after that, for reasons that are still unclear to me, either ADFS won’t issue you a non-expired cookie, or SharePoint looks at the cookie it gets and transforms it into an expired one.  That’s what kicks off the DOS cycle I described above.  In retrospect I realize there have been a few cases in the past where folks asked me about this happening to them, and the lack of sticky sessions was probably the culprit.

In short, there should be no confusion or waffling on this issue going forward – for SharePoint 2010, if you are going to use claims authentication, USE AFFINITY WITH YOUR LOAD BALANCER!

UPDATE 6/22/2012 – My friend Mark P. correctly points out that this affinity is required for FBA too, as well as SAML claims.  Make sure you are on top of this for both!

Windows Phone 7 and Other Mobile Device Access to SharePoint 2010 SAML Sites After Applying SP1 and June 2011 CU

As I alluded to in an earlier post, there have been some changes in SP1 and the June CU that impact the ability of Windows Phone 7 (both RTM and Mango) and other mobile devices to access SharePoint sites that use SAML authentication.  I described in a different post how to change the compat.browser file to essentially change the behavior when entering a SAML site so that a mobile browser is treated as a regular desktop browser.  That at least lets you authenticate with the site, and then you can use the normal browser views of the site or manually navigate to the mobile pages.

When SP1 and June CU came out however, the method of modifying the compat.browser file to allow authentication quit working.  Since that time a number of folks have looked at this and for now there is a new work-around that will provide the same behavior as before.  To do this, you need to modify the web.config file for each web application that you intend to use your mobile devices on.  To get this behavior add the following snippet in the system.web section of the web.config of your Web Application:


<browserCaps>
  <result type="System.Web.Mobile.MobileCapabilities, System.Web.Mobile,
    Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />
</browserCaps>



Making this change should allow most mobile devices to access a SAML secured site.  It does not, however, enable the Office Hub application in Windows Phone 7 to work with the site.  If or when the story changes on mobile support I’ll update the blog with another post.

Using the WHR Parameter with SharePoint 2010 and SAML Auth

I’ve seen lots of questions and confusion (and was a little lost myself for a bit) on the fixes in SharePoint 2010 SP1 + June CU to enable use of the WHR parameter.  This does in fact work now but requires a couple of things:

  1. Configure the SPTrustedIdentityTokenIssuer

The SPTrustedIdentityTokenIssuer now has a property called UseWHomeRealmParameter; it must be set to true in order for SharePoint to pass the WHR parameter along to the IdP.  Here’s a short PowerShell snippet that I used to do this:

$ap = Get-SPTrustedIdentityTokenIssuer -Identity "ADFS with Roles"
$ap.UseWHomeRealmParameter = $true
$ap.Update()  #persist the change

Pretty simple – now SharePoint will send along any WHR parameter that it finds.

  2. Write or do “something” to append the WHR parameter

In my case I wrote an HttpModule to append the WHR parameter.  Specifically here is how I did it:

  • In the Init override I added a handler for the BeginRequest event
  • In the code for the BeginRequest event I look to see if:
    • The request is headed to _trust/default.aspx AND
    • The request does NOT have a WHR parameter included yet
  • If the request meets the two criteria described above, I create a redirect back to the _trust/default.aspx.  When I do that I:
    • Add every query string parameter that was there before
    • Append my WHR parameter to the end
    • Response.Redirect back to _trust/default.aspx
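
The steps above can be sketched as a small HttpModule.  This is my own illustration, not the code from the attached project; the class name and the home realm value are placeholders you would replace with your own:

```csharp
using System;
using System.Web;

//a minimal sketch of an HttpModule that appends a whr parameter;
//the class name and realm value below are illustrative only
public class WhrModule : IHttpModule
{
    //the home realm you want passed along to the IdP - an assumption here
    private const string HomeRealm = "urn:myadfs:realm";

    public void Init(HttpApplication context)
    {
        //hook the BeginRequest event
        context.BeginRequest += OnBeginRequest;
    }

    private void OnBeginRequest(object sender, EventArgs e)
    {
        HttpApplication app = (HttpApplication)sender;
        HttpRequest rqst = app.Request;

        //only act on requests headed to _trust/default.aspx that
        //don't already carry a whr parameter
        if (rqst.Url.AbsolutePath.EndsWith("/_trust/default.aspx",
            StringComparison.OrdinalIgnoreCase) &&
            rqst.QueryString["whr"] == null)
        {
            //redirect back with the original query string plus whr appended
            string qs = (rqst.Url.Query.Length == 0) ? "?" : rqst.Url.Query + "&";
            app.Response.Redirect(rqst.Url.AbsolutePath + qs + "whr=" +
                HttpUtility.UrlEncode(HomeRealm));
        }
    }

    public void Dispose() { }
}
```

The module would also need to be registered in the web application’s web.config (the modules section under system.webServer for the IIS integrated pipeline) before it will run.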

That was it.  I tested this using an InPrivate browser session in IE because it won’t use the authentication cookies that could otherwise throw off the test results.  So far all tests have worked just as hoped – I’ve also verified in Fiddler that the WHR parameter is now flowing over to ADFS (in my case) as desired.  I’ve attached the source code for my simple little project to this posting to help get you started.

You can download the attachment here:

A New Twist on an Old Friend – “The security validation for this page is invalid” – in SharePoint 2010

I’m sure many, many, many of you have seen my old friend (okay, exception) that goes something like this:  “The security validation for this page is invalid.  Click Back in your Web browser, refresh the page, and try your operation again.”  It’s like the jelly of the month club – it’s the gift that keeps on giving.  This problem commonly crops up when you are trying to add an item to a list…often in the guts of a RunWithElevatedPrivileges delegate.  Now many folks have found some common problems and common work arounds:

  • Common problems – using RunWithElevatedPrivileges, but using a pre-existing SPSite or SPWeb context.  No joy there because your existing SPSite or SPWeb context doesn’t enjoy the privileges of the subsequently invoked RunWithElevatedPrivileges.
  • Common work around – set the AllowUnsafeUpdates property on the SPWeb to true, add your item to the list, then set the property back to false.  Okay, it works, but it generally makes me feel like I over-indulged on July’s jelly (you know…from my jelly of the month club?? 🙂  Stick with me here).

The rather uncommon work-around I found was in a similar situation – I was using a custom page in the _layouts directory and in the code-behind I was adding a list item.  I got it working the quick and yucky way in about 2 minutes because I’ve seen this problem so many times before (dating back to SharePoint 2003, when they added the FormDigest control).  This time though I wanted to dig into it a bit more to see if I could find an alternative.  This time I did.

As it turns out, you can call SPUtility.ValidateFormDigest() right before you start your RunWithElevatedPrivileges code.  That gets the form validation code in the stack and lets the add continue successfully.  In my particular case I was inheriting from LayoutsPageBase and my aspx markup was configured like any other _layouts page, in that it used a master page that already contained a <form> element and a <FormDigest> control instance.  If you are missing either of these in your page or master then you will need to add them as well.  The AllowUnsafeUpdates code is not needed any longer and so the whole page and process is just that much safer.  Take a look at this approach the next time you are working in a similar scenario.
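
Put together, the pattern looks roughly like this.  This is a sketch of the approach rather than the code from my page; the list name and field values are placeholders of my own:

```csharp
using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Utilities;

protected void AddItem_Click(object sender, EventArgs e)
{
    //validate the form digest BEFORE elevating; this is the call that
    //clears the "security validation for this page is invalid" error
    SPUtility.ValidateFormDigest();

    SPSecurity.RunWithElevatedPrivileges(delegate()
    {
        //create fresh elevated contexts - don't reuse SPContext.Current
        using (SPSite site = new SPSite(SPContext.Current.Site.ID))
        using (SPWeb web = site.OpenWeb())
        {
            //"Announcements" and the field values are placeholders
            SPListItem item = web.Lists["Announcements"].Items.Add();
            item["Title"] = "Added with a validated form digest";
            item.Update();
        }
    });
}
```

Note that no AllowUnsafeUpdates toggling is needed with this approach, which is the whole point.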

Working with Social Tags in SharePoint 2010

NOTE:  Please download the attachment to get a Word doc with this posting in human readable format.

I had an interesting request recently from someone who wanted help migrating social tags between two different farms.  There are some things out of the box that can help you do this, and some things that are tantalizing, but not quite sufficient.

I tested getting the social tags themselves without any kind of “real” impersonation.  It worked for me but I can’t guarantee that it will in all cases because I was logged in as UPA admin.  If it does not, then you could use the same impersonation technique I’ll describe down below.  To get the tags for a user, I just tried to create a SPSite context as that user, then create all the context objects needed to retrieve tags for that user, like this:

SPUserToken sut = null;

//get the user token for the user first so we can
//use that to get a site context as that user
using (SPSite userSite = new SPSite(UrlTxt.Text))
{
    using (SPWeb rootWeb = userSite.RootWeb)
    {
        SPUser socialUser = rootWeb.EnsureUser(AccountTxt.Text);
        sut = socialUser.UserToken;
    }
}

UPDATE:  Make sure you read this post to see how you should format the user name for EnsureUser:                

//now get the site as that user – NOTE: these are
//all class scoped variables
using (SPSite newSite = new SPSite(UrlTxt.Text, sut))
{
    sc = SPServiceContext.GetContext(newSite);
    stm = new SocialTagManager(sc);
    upm = new UserProfileManager(sc);
    up = upm.GetUserProfile(AccountTxt.Text);
}
Once I have the contexts that were created as the current user, then it’s pretty easy to get the tags for that user:

SocialTag[] tags = stm.GetTags(up);

foreach (SocialTag tag in tags)
{
    TagLst.Items.Add(tag.Term.Name + " – " + tag.Url.ToString());
}

This part was fairly straightforward and easy.  Writing a social tag for a different user was unfortunately not nearly as easy.  The SocialTagManager includes an AddTag method, but it doesn’t provide an overload that takes a UserProfile the way the GetTags method does.  This is an unfortunate oversight, and using the context of the user that was passed into the new SPSite constructor does not help.  As a result, you need to use impersonation to do this.  In this case I just reused the approach I described in an earlier posting – I configured the CTWTS (the Claims to Windows Token Service) to allow my account (since the application was running in my user context) to impersonate users.  Specific details on how to do that are described in that posting as well.

So with that approach in mind, here’s how I did the impersonation first:

//start the impersonation
//create the WindowsIdentity for impersonation
WindowsIdentity wid = null;

try
{
    wid = S4UClient.UpnLogon(EmailTxt.Text);
}
catch (SecurityAccessDeniedException adEx)
{
    MessageBox.Show("Could not map the Email to " +
        "a valid windows identity: " + adEx.Message);
}

//see if we were able to successfully log in
if (wid != null)
{
    using (WindowsImpersonationContext ctx = wid.Impersonate())
    {
        //code goes here to add a new tag
    }
}
else
{
    MessageBox.Show("Couldn't impersonate user – can't add tag.");
}

The impersonation code itself is not particularly complex – you just need the user’s email address (which you could probably get from their UserProfile, which this code retrieves).  You also need the CTWTS running, which is installed on each SharePoint server, and this code needs to run on a SharePoint server since it uses the object model.  So again, this should not be a significant hurdle.

Finally, adding the new tag for the user does require jumping through a few hoops but is pretty manageable.  Here’s what that code looks like:

//this is where the code goes that gets the SPSite, SPServiceContext, etc.

//work with the taxonomy classes so we
//can reuse any existing term, or create a
//new one if it doesn't exist
TaxonomySession txs = stm.TaxonomySession;
TermStore ts = txs.DefaultKeywordsTermStore;

TermCollection terms = ts.GetTerms(TagTxt.Text,
    ts.DefaultLanguage, true);

Term t = null;

if (terms.Count == 0)
{
    t = ts.KeywordsTermSet.CreateTerm(TagTxt.Text,
        ts.DefaultLanguage);
}
else
{
    t = terms[0];
}

//add the tag
stm.AddTag(new Uri(TagUrlTxt.Text), t);
So what happens here is we just look in the taxonomy store for the tag that is being added.  If we find it in there then we use that term; if not, we create it and add it to the term store.  Then we add the term and associate it with a Url, and it gets added to that user’s social tags.

Overall the code and approach are fairly straightforward.  The main problem here is just working around the inability to specify which user a social tag should be added for.  The impersonation code and the CTWTS take care of that part for us without requiring the password for each user.  The source code for this project is attached to the posting.


You can download the attachment here: