The SPMigrateUsers Tool for Changing Account Identities in SharePoint 2010

There are times in SharePoint when you want or need to change an account identity.  The best example is with SAML claims.  In virtually all of my examples I use email address as the identity claim for users.  I do this because a) most people have an email address and b) an email address is something that most users understand.  However, I get pushback every now and then about using email address because people can change it.  If you change your email address, then obviously all the permissions granted to the old address break.  To be honest, I don't consider this to be a frequent scenario, or I wouldn't use email address to begin with.  However, I will grant you that it does occasionally happen, so what do you do when all of your SharePoint content is secured with email addresses?

The key to controlling this is in a blog post I did previously on the IMigrateUserCallback interface:  http://blogs.technet.com/b/speschka/archive/2011/01/27/migrating-user-accounts-from-windows-claims-to-saml-claims.aspx.  In that post I described how to migrate identities using this interface and provided an example of how to convert a Windows claims identity to a SAML claims identity.  What I decided to do is just write a little Windows application that will let you enter the credentials that you want to change, and it will make the modification for you.  Its goal in life is to be used as a simple one-off tool for making these changes; however, you could easily take the source code (which I'm including with this post) and modify it to do something more creative, such as reading in a list of users from a file or database and doing the comparisons yourself.
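If you want to build on the source code, the core pattern is the one the post above describes: implement IMigrateUserCallback and hand it to the web application's MigrateUsers method.  Here is a minimal sketch of that pattern, not the tool's actual code; the class and the idea of comparing a single old/new pair are my own illustration, and you should double-check the interface member name against the SharePoint 2010 object model before relying on it.

using System;
using Microsoft.SharePoint.Administration;

// Sketch only: migrates a single old account value to a new one, which is
// essentially what the tool does when you click Migrate.
public class SimpleAccountMigrator : IMigrateUserCallback
{
    private readonly string oldAccount;   // encoded claim value being migrated from
    private readonly string newAccount;   // encoded claim value being migrated to

    public SimpleAccountMigrator(string oldAccount, string newAccount)
    {
        this.oldAccount = oldAccount;
        this.newAccount = newAccount;
    }

    // SharePoint calls this for each account it finds; return the new account
    // name to migrate it, or null to leave the account alone.
    public string ConvertFarmAccount(string userAccount)
    {
        if (string.Equals(userAccount, oldAccount, StringComparison.OrdinalIgnoreCase))
        {
            return newAccount;
        }

        return null;
    }
}

// Usage (sketch): SPWebApplication.Lookup(new Uri("http://yourWebApp"))
//     .MigrateUsers(new SimpleAccountMigrator(oldEncodedClaim, newEncodedClaim));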

What's also nice about this tool is that it can actually be used for multiple scenarios.  Not only can you use it to convert from one email address to another, you can also use it to convert from one group name to another.  This is another case where certain Ops folks tell me, hey, you should use SIDs for the group names (i.e. Role claims in SAML) because if you rename the group the SID remains the same.  While that's also true, again, a) I don't see that happen a ton, b) how many of you want to start typing in group SID names and adding them to SharePoint groups (please seek therapy now if you answer yes), and c) SIDs mean nothing outside of your local Active Directory – as soon as you move into a cloud-based service like Azure, Google, Yahoo, Facebook, etc. your SID will be as useless as [fill in your own "useless as a …" joke here].

The other thing that’s nice about this tool is that it doesn’t restrict you to just making changes within a single type of provider.  Want to change a Windows group to a SAML role claim?  You can do that.  Want to change a SAML identity claim to an FBA membership user?  You can do that.  Want to change an FBA role to an AD group?  You can do that.  You get the idea – I’ve tried just about every combination of different “users” and “groups” between different providers and so far all have converted back and forth successfully.

The tool itself is hopefully pretty straightforward to use; here’s a picture of it:

When the application first starts up it loads a list of all the web applications.  For each web application it populates the two combo boxes below it with a list of all the providers being used on that web application.  If you have multiple SAML providers or multiple FBA providers, each one will be listed in the drop down.  You simply choose which provider you are migrating from and which you are migrating to.  In the Claim Value section you type in the value that you want to migrate, and the value you want to migrate it to.  Just type the value in the Plain Text Value edit fields and click either the identity claim button (the one on the left) or the group claim button (the one on the right).  The description in the dialog text gives a full explanation of this, and the text on the buttons changes so it makes more sense depending on which identity provider you are using.

For example, suppose you are only using SAML authentication and wanted to migrate the email address "steve@contoso.com" to "stevep@contoso.com".  You would pick your web application, and the SAML authentication provider would be selected by default in each drop down.  Then in the Before Values section you would type "steve@contoso.com" in the Plain Text Value edit and click the ID Claim button; that puts the correct encoded claim value in the Encoded Value edit.  Next you would type "stevep@contoso.com" in the After Values Plain Text Value edit.  Click the ID Claim button again and it puts the correct value in the Encoded Value edit box (NOTE:  In the picture above the button in the After Values section says "User" instead of "ID Claim" because in that example it is migrating from SAML claims to Windows claims).  Once all of your values have been provided just click the Migrate button to complete the process; a message box will appear informing you when the migration is complete.
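For the curious, those buttons are just producing SharePoint's encoded claim strings.  If you ever want to generate (or sanity check) one of those Encoded Value strings yourself, a minimal sketch looks like the following; the trusted provider name "SAML Provider" is a placeholder for whatever your SPTrustedIdentityTokenIssuer is actually called.

using Microsoft.SharePoint.Administration.Claims;

public static class ClaimEncodingSample
{
    // Build the encoded claim value for an identity (email) claim issued by a
    // trusted (SAML) provider; run this on a SharePoint server.
    public static string EncodeEmailClaim(string email, string trustedProviderName)
    {
        SPClaim claim = new SPClaim(
            "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress",
            email,
            "http://www.w3.org/2001/XMLSchema#string",
            SPOriginalIssuers.Format(SPOriginalIssuerType.TrustedProvider, trustedProviderName));

        return SPClaimProviderManager.Local.EncodeClaim(claim);
    }
}

// For example, ClaimEncodingSample.EncodeEmailClaim("steve@contoso.com", "SAML Provider")
// returns the same sort of string the tool drops into the Encoded Value box.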

In the process of testing this across several different web applications and several different authentication types I did run across a couple of issues that I want to raise here in case you see them as well.  In one case I got an Access Denied error message when trying to migrate the users for one particular web application.  I never was able to track down why this was occurring, so the best I can say is that something is wonky in that web app, but I'm not sure what, because it worked on the four or five other web apps I tried in my farm.

The second thing is that in one case the migration said it completed successfully but I could not log in as the migrated user.  In digging into it further I found that the account I was migrating from was not being pushed through the IMigrateUserCallback function (i.e. it's a SharePoint problem, not a coding problem with this application).  If that happens to you I recommend using the source code and Visual Studio to step through it in the debugger and make sure the account you're migrating from is actually getting passed to the callback.  Unfortunately I had one lonely FBA membership user that got stuck alone in the wilderness.

Finally, one last thing to note – don't freak out if you migrate an account from one value to another, then log in as the new user and see the old account name, etc. in the welcome control in the top right corner of the page.  The migrate function just changes the account name.  If any of the other user information has changed, then as long as you update the user profiles, the correct information should get pushed down to all the site collections on the next sync with the profile system.

That's it – have at it and I hope it helps.  As I said above, the complete source code is included, so feel free to play with it and modify it as needed for your scenario.

Here’s a link to the source code:

 

Creating an Azure Persistent VM for an Isolated SharePoint Farm

The first step in being able to create a persistent VM in Azure is to get your account upgraded to take advantage of these features, which are all in preview.  Once the features are enabled you can follow this process to get the various components configured to support running an isolated SharePoint farm. 

In this case, by "isolated farm" I mean one in which there are two virtual images.  In my scenario one image is running Active Directory, DNS and ADFS.  The other image is running SharePoint 2010 and SQL 2012.  The second image is joined to the forest running on the first image.  The IP address used by the domain controller (SRDC) is 192.168.30.100; the IP addresses for the SharePoint server (SRSP) are 192.168.30.150 and 192.168.30.151.

IMPORTANT:  Make sure you enable Remote Desktop on your images before uploading them to Azure (it is disabled by default).  Without it, you will find it very difficult to manage your farm, specifically things like Active Directory and ADFS.

  1. Create a new Virtual Network.  Since I’m talking about an isolated farm here, that implies that I do not want to connect or integrate with my corporate identity or name services (like AD and DNS).  Instead I’m going to configure a new virtual network for my small farm.  Here are the sub steps for creating a virtual network for this scenario:
    1. Click New…Network…Custom Create.
    2. Type in a Name for the network, then from the Affinity Group drop down select Create a new affinity group.  Select either West US or East US as the region and type a name for the affinity group.  In this example I use the name SamlAffinity for the affinity group and SamlNetwork for the network name.  Click the Next button.
    3. Enter the address space in CIDR format for the IP addresses you want to use.  Now your first question may be: what is CIDR format?  A full explanation is beyond the scope of this post, but in short the number after the slash is how many leading bits of the address make up the network prefix, and there are web sites that will calculate it for you, like http://ip2cidr.com/.  In this case I wanted to use the entire 192.168.30.0 subnet (192.168.30.0 through 192.168.30.255), so I entered 192.168.30.0/24.  Note that you can optionally also create different subnets for use within your virtual network, but that is not needed for this scenario so I did not do it.  Click the Next button.
    4. For this particular scenario you can skip selecting a DNS server because we’ll configure it on the servers themselves once they’re up and running.  Click the Finish button to complete this task and create the virtual network.
  2. Create a new storage account to store your virtual machines.  This step is fairly straightforward – in the new management portal click on New…Storage…Quick Create.  Give it a unique name and click on the Region/Affinity Group drop down.  Select the affinity group you created in the previous step, then click the Create Storage Account button.  For this example, I’ve called my new storage account samlvms.
  3. Upload the images that you will be using.  In this case I have two images – SRDC and SRSP – that I need to push up to the storage account I created earlier – samlvms.   Uploading the images can be done with the csupload tool that is part of the Windows Azure 1.7 SDK, which you can get from https://www.windowsazure.com/en-us/develop/other/.  The documentation for using csupload can be found at http://msdn.microsoft.com/en-us/library/windowsazure/gg466228.aspx.  Detailed instructions on creating and uploading a VHD image using this new command can also be found at http://www.windowsazure.com/en-us/manage/windows/common-tasks/upload-a-vhd/.  A few other notes:
    1. UPDATE:  I added instructions for how to create the certificate used with the csupload tool.  You can find it at http://blogs.technet.com/b/speschka/archive/2012/09/15/creating-and-using-a-certificate-for-the-csupload-tool-with-azure-iaas-services.aspx.
    2. In this case I'm using images that are already ready to go – they are not sysprepped, they have a domain controller or SharePoint and SQL installed, and they just need to be started up.  You can use an image like that as the basis for a new virtual machine, but you need to use the Add-Disk command instead of the Add-PersistedVMImage command.  Use the latter if you have a sysprepped image upon which you want to base new images.
    3. Figuring out your management certificate thumbprint when you create the connection can be somewhat mystical.  The detailed instructions above include information on how to get this.  In addition, if you have already been publishing applications with Visual Studio then you can use the same certificate it does.  You have to go into Visual Studio, select Publish, then in the account drop down click on the Manage… link.  From there you can get the certificate that's used.  If you are trying to use csupload on a different machine then you'll also need to export the certificate (including the private key) and move it to wherever you are using csupload.  Once you copy it over you need to add it to your personal certificate store; otherwise csupload will complain that it is unable to find a matching thumbprint or certificate.
    4. Here’s an example of the commands I used:
      1. csupload Set-Connection "SubscriptionID=mySubscriptionID;CertificateThumbprint=myThumbprintDetails;ServiceManagementEndpoint=https://management.core.windows.net"
      2. csupload Add-Disk -Destination "http://samlvms.blob.core.windows.net/srsp.vhd" -Label "SAML SharePoint" -LiteralPath "C:\srsp.vhd" -OS Windows -Overwrite
      3. csupload Add-Disk -Destination "http://samlvms.blob.core.windows.net/srdc.vhd" -Label "SAML DC" -LiteralPath "C:\srdc.vhd" -OS Windows -Overwrite
  4. Once the images are uploaded, you can create new virtual machines based on them.
  5. Click on New…Virtual Machine…From Gallery.
  6. Click on My Disks on the left, then select the image you want to create the virtual machine from in your image library on the right, and click the Next button.
  7. Type a machine name and select a machine size, then click the Next button.
    1. Select standalone virtual machine (unless you are connecting to an existing one) and enter an available DNS name, select your region and subscription, then click the Next button.
    2. Either use no availability set, select an existing one, or create a new one; when finished, click the Finish button to complete the wizard.

Your images may go through multiple states, including "Stopped", before they finally enter the running state.  Once an image starts running, you need to give it a couple of minutes or so to boot up, and then you can select it in the Azure portal and click the Connect button at the bottom of the page.  That creates and downloads an RDP connection that you can use to connect to your image and work with it.

It's also important to note that your network settings are not preserved.  What I mean by that is that my images were using static IP addresses, but after starting up in Azure they were using DHCP and getting new local addresses, so the images require some reconfiguration to work.

Networking Changes

The networking configuration changes for the images once they are started in Azure.  Azure persistent VMs use DHCP, but the leases last indefinitely, so it behaves very much like fixed IP addressing.  One of the big limitations, though, is that you can only have one IP address per machine, which means the second lab for the SAML Ramp will not be feasible.

To begin with though you need to correct DNS and the domain controller, so RDP into the domain controller first (SRDC in my scenario).  Restart the Net Logon service, either through the Services applet or in a command prompt by typing net stop netlogon followed by net start netlogon.  This registers the new DHCP address as one of the host addresses for the domain.  Next you need to delete the old host address for the domain, which for me was 192.168.30.100.  Open up DNS Manager and then double-click on the Forward Lookup Zone for your domain.  Find the host (A) record with the old address, 192.168.30.100 in my case (it will also say "(same as parent folder)" in the Name column), and delete it.

Next you need to change the DNS server for your network adapter to point to the DHCP address that was assigned to the image.  Open a command prompt, type ipconfig and press Enter.  The IPv4 Address that is shown is what needs to be used as the DNS server address.  To change it, right-click on the network icon in the taskbar and select Open Network and Sharing Center.  Click on the Change adapter settings link.  Right-click on the adapter and choose Properties.

When the Properties dialog opens, uncheck the box next to Internet Protocol Version 6.  Click on Internet Protocol Version 4 but DO NOT uncheck the box, then click on the Properties button.  In the DNS section click on the radio button that says Use the following DNS server addresses and for the Preferred DNS server enter the DHCP address for the SRDC server that you retrieved using ipconfig.  Click the OK button to close the Internet Protocol Version 4 Properties dialog, then click the OK button again to close the network adapter Properties dialog.  You can now close the Network Connections window.

Now if you open a command prompt and type ping followed by your Active Directory forest name, it should resolve the name and respond; on my image it responded with the address 192.168.30.4.

On the SharePoint server you just need to change the Primary DNS server IP address to the IP address of the domain controller, which in this example was 192.168.30.4.  After doing so you should be able to ping your domain controller name and Active Directory forest name.  Once this is working you need to get the new IP address that’s been assigned to the SharePoint server and update DNS on the domain controller if you used any static host names for your SharePoint sites.  One limitation that could NOT be addressed in this scenario is the fact that my SharePoint server used multiple IP addresses; persistent images in Azure currently only support a single IP address.

403 Forbidden Errors When Failing Over a SQL 2012 Availability Group with SharePoint 2010

I just had a heck of a time getting failover of a SQL 2012 Availability Group to work correctly with SharePoint 2010, so I thought I would share the outcome in case it helps anyone else.  In short, I had my SQL 2012 Availability Group all set up and it appeared to be working correctly.  I created a new content database on the primary node in the group, then backed it up and added it to the list of databases managed by the Availability Group (AG).  So far, so good.  I could hit the SharePoint site and it rendered just fine.  However, after I failed over the AG to a new node, my SharePoint site would not come up any more; I would get a 403 Forbidden error instead of the page content.  What was really vexing though is that I could open up SQL Server Management Studio and connect to my AG Listener just fine – I could query and get results from any of the tables in my content database, which was now hosted on a different server.

After spending mucho time trying to figure this out, my friend and resident SQL nut job (in a good way!) Bryan P. pointed out that while the database user for my app pool account had moved over with my database, the SQL login did not.  What I mean by that is if I look in SQL Server Management Studio at the content database and look at Security…Users, I will see the SQL account for the app pool.  However, if I look at the top-level Security node for the server and then Logins, there is not a corresponding login for the app pool account.  So, I just created the login for the app pool account and then granted it rights to the content databases I was managing with the AG.  After making that change, everything worked fine on the SharePoint side – I can now fail over to any node in the cluster and my SharePoint site continues to work just fine.

This is a good fact to be aware of, especially as you are creating app pools with new accounts and want your content databases to be protected with an AG – make sure you add those new accounts to the logins on each SQL 2012 server that is participating in your AGs.

Getting Welcome Emails to Work with a Custom Claims Provider in SharePoint 2010

A good "friend of the blog", Israel V., was good enough to point out to me recently that pretty much all of the code samples that we have for custom claims providers contain an irritating little flaw – if you follow these samples, then the welcome emails that should be sent out when you add a new person to a site will not be sent.  I, of course, am as guilty as anyone of this, so I took a closer look at the situation, as well as a quick look at some code that Israel had developed to work around this problem.

In a nutshell, you will see this problem happen if you are adding a user to a site collection for the very first time and so there is no email address associated with that person – because a profile sync hasn’t occurred or whatever.  So, as you can imagine the key here (and I’m boiling this down to the simplest case scenario) is to grab an email address for the user at the time they are added and then plug it into the appropriate property in your PickerEntity class.  Now let’s talk about some of the particulars.

WHERE you get the email address from is going to totally depend on your claims provider.  If you’re pulling your data from Active Directory then you can query AD to get it.  If you’re using SAML and email address is the identity claim, then you can simply reuse that.  Basically, “it depends” so you’ll need to make the call here.

WHEN you want to use it is when the FillResolve method is called.  As you know, this method can be called either after someone adds an entry via the People Picker or when they type a value in the type-in control and click the resolve button.  As I've shown in many of my code samples, during that process you create an instance of the PickerEntity class so that you can add it to the List<PickerEntity> that is passed into the method.

HOW you add it is just to set the property on the PickerEntity instance like so:

//needed to make welcome emails work:

pe.EntityData[PeopleEditorEntityDataKeys.Email] = "steve@stevepeschka.com";

In this example "pe" is just the instance of the PickerEntity class that I created in my FillResolve method and added to the list of resolved entities.
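To show where that line lands, here's a trimmed-down sketch of a FillResolve override with the email fix in place.  It is a fragment of a custom claims provider class, not a complete provider, and it assumes email address is the identity claim so the resolve input can simply be reused as the email value; swap in your own lookup (AD, SAML attribute, membership provider, etc.) as discussed above.

using System;
using System.Collections.Generic;
using Microsoft.SharePoint.Administration.Claims;
using Microsoft.SharePoint.WebControls;

// Fragment of a custom SPClaimProvider; only the FillResolve piece is shown.
protected override void FillResolve(Uri context, string[] entityTypes,
    string resolveInput, List<PickerEntity> resolved)
{
    // Assumption: email address is the identity claim, so the value the user
    // typed is the email address. Otherwise, look the address up here.
    string email = resolveInput;

    PickerEntity pe = CreatePickerEntity();
    pe.Claim = CreateClaim(
        "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress",
        email, "http://www.w3.org/2001/XMLSchema#string");
    pe.DisplayText = email;
    pe.EntityType = SPClaimEntityTypes.User;
    pe.EntityData[PeopleEditorEntityDataKeys.DisplayName] = email;

    // The line that makes welcome emails work:
    pe.EntityData[PeopleEditorEntityDataKeys.Email] = email;

    pe.IsResolved = true;
    resolved.Add(pe);
}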

That's all there really is to it.  The biggest trick may very well be just getting the email address value.  Once you have it though, it's pretty easy to add it to the PickerEntity to ensure your welcome emails will work.  I tested this out and verified that both a) the welcome emails were not getting sent out with my original custom claims provider, and b) they DID start going out after incorporating this change.  Thanks again to Israel V. for the heads up and code sample on this issue.

The Issuer of a Token is not a Trusted Issuer Craziness with SAML Claims in SharePoint 2010

Let’s be honest – every now and then SharePoint lies to us.

Case in point – I was working with my friend Nidhish today, getting SAML working on a SharePoint site.  We started out by getting a strange HTTP 500 error when we hit the site.  That in and of itself is unusual in my experience.  So to try and understand the issue better we cracked open the ULS logs and found this error:  "The issuer of the token is not a trusted issuer."  Now, having set up SAML in SharePoint approximately 3,492,234 times, I was fairly confident that we had configured the certificates correctly.  Nonetheless, we then spent a fair amount of time looking at the certificates we had registered with the SPTrustedRootAuthority, comparing certificate thumbprints, double-checking the certificates in ADFS, recycling services and boxes, etc.  It just made absolutely no sense at all, because every aspect of the certificate configuration appeared to be correct.

Finally I decided to review all of the relying party settings in ADFS again, and that’s where I found the “real” problem.  Turns out the WS-Fed endpoint for the relying party was mistakenly set to “https://foo”, instead of “https://foo/_trust”.  All the certificates were in fact correct, but the request was getting redirected to the root instead of the _trust directory.  Once the WS-Fed endpoint was updated everything began working.  Just a little nugget that you may find helpful sometime.

Finally A USEFUL Way to Federate With Windows Live and SharePoint 2010 Using OAuth and SAML

Lots of folks have talked to me in the past about federating SharePoint with Windows Live.  On the surface it seems like a pretty good idea – Windows Live has millions of users, everyone logs in with their email address (which is something we use a lot as an identity claim), it's a big scalable service, and we have various instructions out there for how to do it – either directly or via ACS (Access Control Service).  So why might I be so grumpy about using it with SharePoint?  Well, those of you that have tried it before know – when you federate with Windows Live you never get a user's email address back as a claim.  All you get is a special Windows Live identifier that is called a PUID.  As far as I know, "PUID" should stand for "Practically GUID", because that's pretty much what it looks like and about how useful it is.

For example, if you DO federate with Windows Live, how do you add someone to a site?  You have to get their PUID, and then add the PUID to a SharePoint group or permission level.  Do you seriously know anyone who knows what their PUID is?  (If you are such a person, it's time to find something else to do with your free time.)  Even if you did magically happen to know what your PUID is, how useful do you think that is if you're trying to grant users rights to different site collections?  Do you really think anyone else could pick you out of a PUID lineup (or people picker, as the case may be)?  Of course not!  And thus my frustration with it grows.

I actually thought that we might have a shot here at a more utopian solution with ACS.  ACS is really great in terms of providing out of the box hooks to several identity providers like Windows Live, Google, Yahoo, and Facebook.  With Facebook they even sprinkle a little magic on it and actually use OAuth to authenticate and then return a set of SAML claims.  Very cool!  So why don't they do that with Windows Live as well?  Windows Live supports OAuth now, so it seems like there's an opportunity for something valuable to finally happen.  Well, despite wishing it were so, the ACS folks have not come to the rescue here.  And therein lies the point of this preamble – I finally decided to just write one myself, and that is what this post is about.

So why do we care about OAuth?  Well, unlike the PUID you get when federating directly with Windows Live, OAuth support in Windows Live allows you to get a LOT more information about the user, including – wait for it – their email address.  So the plan of attack here is basically this:

  1. Write a custom Identity Provider using the Windows Identity Foundation (WIF).
  2. When a person is redirected to our STS, if they haven’t authenticated yet we redirect them again to Windows Live.  You have to create “an application” with Windows Live in order to do this, but I’ll explain more about that later.
  3. Once they are authenticated they get redirected back to the custom STS.  When they come back, the query string includes a login code; that login code can be exchanged for an access token.
  4. The STS then makes another request to Windows Live with the login code and asks for an access token.
  5. When it gets the access token back, it makes a final request to Windows Live with the access token and asks for some basic information about the user (I'll explain what we get back later; a rough sketch of steps 4 and 5 follows this list).
  6. Once we have the user information back from Windows Live, we use our custom STS to create a set of SAML claims for the user and populate it with the user info.  Then we redirect back to whatever application asked us to authenticate to begin with to let it do what it wants with the SAML tokens.  In this particular case I tested my STS with both a standard ASP.NET application as well as a SharePoint 2010 web app.

So…all the source code is attached to this posting, but there's still some configuration to do, and you will have to recompile the application with the app ID and secret that you get from Windows Live.  Other than doing that copy and paste, though, there really isn't any code you need to write to get going.  Now let's walk through everything you need to use it.

Create a Token Signing Certificate

You will need to create a certificate that you will use to sign your SAML tokens.  There's nothing special about the certificate you use to sign tokens, other than you need to make sure you have the private key for it.  In my case I have Certificate Services installed in my domain, so I just opened the IIS Manager and selected the option to create a Domain Certificate.  I followed the wizard and before you know it I had a new certificate, complete with private key.  For this project, I created a certificate called livevbtoys.

As I'll explain in the next section, when requests initially come into the STS the user is an anonymous user.  In order to use that certificate to sign SAML tokens, we need to grant the IIS process access to the private key for that certificate.  When an anonymous request comes in, the IIS process identity is Network Service.  To give it rights to the key you need to:

  1. Start the MMC
  2. Add the Certificates snap-in.  Select the Computer store for the local computer.
  3. Open up the Personal…Certificates store and find the certificate you created for signing SAML tokens.  If you created it as I explained above, the certificate will be in there by default.  If you created it some other way you may need to add it to that store.
  4. Right click on the certificate and choose the option to Manage Private Keys.
  5. In the list of users that have rights to the keys, add Network Service and give it Read rights to it.

Note that if you don't do this correctly, when you try running the application you may get an error that says something like "keyset does not exist".  That just means the IIS process did not have sufficient rights to the private key, so it could not use it to sign the SAML token.

Install the Application and Required Assemblies

Installing the application in this sense really just means creating an ASP.NET application in IIS, copying the bits, and making sure the latest version of WIF is installed.  Once you get it configured and working on one server of course, you would want to add one or more additional servers to make sure you have a fault tolerant solution.  But I’ll just walk through the configuration needed on the single server.

I won't go into how you create an ASP.NET application in IIS.  You can do that with Visual Studio, in the IIS Manager, etc.

NOTE:  If you use the code that’s provided here and just open the project in Visual Studio, it will complain about the host or site not existing. That’s because it’s using the name from my server.  The easiest way to fix this is just to manually edit the WindowsLiveOauthSts.sln file and change the https values in there to ones that actually exist in your environment.

Once it’s actually created there are a few things you want to make sure you do.

  1. Add PassiveSTS.aspx as the default document in the IIS Manager for the STS web site.
  2. Change the Authentication settings for the application in IIS so that all authentication types are disabled except for Anonymous Authentication.
  3. The STS needs to run over SSL, so you will need to acquire an appropriate certificate for that and make sure you update the bindings on the IIS virtual server where the custom STS application is used.
  4. Make sure you put the thumbprint of your token signing certificate in the thumbprint attribute of the add element in the trustedIssuers section of the web.config of your relying party (if you are NOT using SharePoint to test).  If you use the Add STS Reference wizard in Visual Studio it will do this for you.

That should be all of the configuration needed in IIS.

Update and Build the Custom STS Project

The attached zip file includes a Visual Studio 2010 project called WindowsLiveOauthSts.  Once IIS is configured and you've updated the WindowsLiveOauthSts.sln file as described above, you should be able to open the project successfully in Visual Studio.  One of the first things you'll need to do is to update the CLIENT_ID and CLIENT_SECRET constants in the PassiveSTS.aspx.cs class.  You get these when you create a new Windows Live application.  While I'm not going to cover that step by step (because there are folks at Windows Live who can help you with it), let me just point you to the location where you can go to create your Windows Live app:  https://manage.dev.live.com/Applications/Index?wa=wsignin1.0.  Also, when you create your application, make sure you set the Redirect Domain to the location where your custom STS is hosted, i.e. https://myserver.foo.com.

Now that you have your ID and secret here’s what needs to be updated in the application:

  1. Update the CLIENT_ID and CLIENT_SECRET constants in the PassiveSTS.aspx.cs class.
  2. In the web.config file update the SigningCertificateName in the appSettings section.  Note that you don’t have to change the IssuerName setting but you obviously can if you want.
  3. Update the token signing certificate for the FederationMetadata.xml document in the STS project.  Once you’ve selected the certificate you’re going to use, you can use the test.exe application included in this posting to get the string value for the certificate.  It needs to be copied in to replace the two X509Certificate element values in federationmetadata.xml.

There’s one other thing worth pointing out here – in the CustomSecurityTokenService.cs file you have the option of setting a variable called enableAppliesToValidation to true and then providing a list of Urls that can use this custom STS.  In my case I have chosen not to restrict it in any way, so that variable is false.  If you do want to lock down your custom STS then you should change that now.  Once all of these changes have been made you can recompile the application and it’s ready to go.
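For context, the place where the Windows Live profile data gets turned into SAML claims is the GetOutputClaimsIdentity override in the custom STS.  The fragment below is only a sketch of that shape, not the code from the attached project; the liveUser object is a hypothetical stand-in for wherever you keep the profile values returned by Windows Live, and the claim type URIs match the ones mapped in the SharePoint Configuration section below.

using Microsoft.IdentityModel.Claims;
using Microsoft.IdentityModel.Protocols.WSTrust;
using Microsoft.IdentityModel.SecurityTokenService;

// Fragment of a WIF SecurityTokenService subclass (CustomSecurityTokenService).
protected override IClaimsIdentity GetOutputClaimsIdentity(
    IClaimsPrincipal principal, RequestSecurityToken request, Scope scope)
{
    IClaimsIdentity identity = new ClaimsIdentity();

    // The identity claim SharePoint is configured to use:
    identity.Claims.Add(new Claim(
        "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress",
        liveUser.Email));

    // A few of the additional claims, matching the New-SPClaimTypeMapping calls below;
    // the rest (first_name, last_name, link, gender, locale, updated_time, account)
    // follow the same pattern.
    identity.Claims.Add(new Claim(
        "http://blogs.technet.com/b/speschka/claims/id", liveUser.Id));
    identity.Claims.Add(new Claim(
        "http://blogs.technet.com/b/speschka/claims/full_name", liveUser.FullName));
    identity.Claims.Add(new Claim(
        "http://blogs.technet.com/b/speschka/claims/accesstoken", liveUser.AccessToken));

    return identity;
}

// liveUser (with Email, Id, FullName, AccessToken properties) is hypothetical: it
// represents whatever object you deserialized the Windows Live profile JSON into.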

One other note here – I also included a sample ASP.NET application that I used for testing while I was building this.  It's in a project called LiveRP.  I'm not really going to cover it here; suffice it to say it's there if you want to try testing things out.  Just remember to change the thumbprint for the STS token signing certificate as described above.

SharePoint Configuration

At this point everything is configured and should be working for the custom STS.  The only thing left to do is to create a new SPTrustedIdentityTokenIssuer in SharePoint and configure a new or existing web application to use it.  There are a few things you should know about configuring the SPTrustedIdentityTokenIssuer though; I'm going to give you the PowerShell that I used to create mine and then explain it:

$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("c:\livevbtoys.cer")

New-SPTrustedRootAuthority -Name "SPS Live Token Signing Certificate" -Certificate $cert

$map = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress" -IncomingClaimTypeDisplayName "EmailAddress" -SameAsIncoming

$map2 = New-SPClaimTypeMapping -IncomingClaimType "http://blogs.technet.com/b/speschka/claims/id" -IncomingClaimTypeDisplayName "WindowsLiveID" -SameAsIncoming

$map3 = New-SPClaimTypeMapping -IncomingClaimType "http://blogs.technet.com/b/speschka/claims/full_name" -IncomingClaimTypeDisplayName "FullName" -SameAsIncoming

$map4 = New-SPClaimTypeMapping -IncomingClaimType "http://blogs.technet.com/b/speschka/claims/first_name" -IncomingClaimTypeDisplayName "FirstName" -SameAsIncoming

$map5 = New-SPClaimTypeMapping -IncomingClaimType "http://blogs.technet.com/b/speschka/claims/last_name" -IncomingClaimTypeDisplayName "LastName" -SameAsIncoming

$map6 = New-SPClaimTypeMapping -IncomingClaimType "http://blogs.technet.com/b/speschka/claims/link" -IncomingClaimTypeDisplayName "Link" -SameAsIncoming

$map7 = New-SPClaimTypeMapping -IncomingClaimType "http://blogs.technet.com/b/speschka/claims/gender" -IncomingClaimTypeDisplayName "Gender" -SameAsIncoming

$map8 = New-SPClaimTypeMapping -IncomingClaimType "http://blogs.technet.com/b/speschka/claims/locale" -IncomingClaimTypeDisplayName "Locale" -SameAsIncoming

$map9 = New-SPClaimTypeMapping -IncomingClaimType "http://blogs.technet.com/b/speschka/claims/updated_time" -IncomingClaimTypeDisplayName "WindowsLiveLastUpdatedTime" -SameAsIncoming

$map10 = New-SPClaimTypeMapping -IncomingClaimType "http://blogs.technet.com/b/speschka/claims/account" -IncomingClaimTypeDisplayName "AccountName" -SameAsIncoming

$map11 = New-SPClaimTypeMapping -IncomingClaimType "http://blogs.technet.com/b/speschka/claims/accesstoken" -IncomingClaimTypeDisplayName "WindowsLiveAccessToken" -SameAsIncoming

$realm = "https://spslive.vbtoys.com/_trust/"

$ap = New-SPTrustedIdentityTokenIssuer -Name "SpsLive" -Description "Windows Live OAuth Identity Provider for SAML" -realm $realm -ImportTrustCertificate $cert -ClaimsMappings $map,$map2,$map3,$map4,$map5,$map6,$map7,$map8,$map9,$map10,$map11 -SignInUrl "https://spr200.vbtoys.com/WindowsLiveOauthSts/PassiveSTS.aspx" -IdentifierClaim "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress"

Here are the things worth noting:

  1. As I stated above, I created a certificate called livevbtoys.cer to sign my tokens with, so I added that to my SPTrustedRootAuthority list and then associated it with my token issuer.
  2. I created claims mappings for all of the claims that my custom STS is returning.  As you can see, it’s SIGNIFICANTLY MORE AND BETTER than you would ever get if you just federated directly to Windows Live.  One other thing to note here – I include the access token that I got from Windows Live as a claim here.  While that works with Facebook, I haven’t tested it so I can’t say for sure if Windows Live will let you reuse it or not.  But maybe that will be the topic of a future post.
  3. The $realm value is critically important.  It must point to the root site of your web application, and include the /_trust/ directory. If you do this wrong, you will just get 500 errors from SharePoint when you get redirected back after authentication.
  4. The -SignInUrl parameter when creating the token issuer is the absolute Url to the PassiveSTS.aspx page for my custom STS.

That’s pretty much it – once it’s set up you are still using the out of the box people picker and claims providers so you won’t have any lookup capabilities, as you would expect.  You grant rights to people with the email addresses that they use to sign into Windows Live.  You could actually extend this example and also use the Azure claims provider I blogged about here:  http://blogs.technet.com/b/speschka/archive/2012/02/11/the-azure-custom-claim-provider-for-sharepoint-project-part-1.aspx.  That means you would be using this STS to enable you to authenticate with Windows Live and get some real SAML claims back, and then using the Azure custom claims provider project to add those authenticated users into your Azure directory store and the people picker to choose them.

The pictures tell it all, so here’s what it looks like when you first hit the SharePoint site and authenticate with Windows Live:

When you first sign in it will ask you if it's okay to share your information with the custom STS application.  There's nothing to be concerned about here – that's standard OAuth permissions at work.  Here's what that looks like; note that it shows the data I'm asking for in the STS – you could ask for an entirely different set of data if you wanted.  You just need to look at the Windows Live OAuth SDK to figure out what you need to change and how:

Once you accept, you get redirected back to the SharePoint site.  In this example I am using the SharePoint Claims web part I blogged about here:  http://blogs.technet.com/b/speschka/archive/2010/02/13/figuring-out-what-claims-you-have-in-sharepoint-2010.aspx.  You can see all the claims I got from Windows Live via OAuth that I now have as SAML claims thanks to my custom STS, as well as the fact that I’m signed in with my Windows Live email address that I created for this project (from the sign in control, top right corner):

 

You can download the attachment here:

One More Claims Migration Gotcha For SharePoint 2010

Hey folks, I've written previously about how to migrate claims users (such as from Windows claims to SAML claims) in this post about the IMigrateUserCallback interface:  http://blogs.technet.com/b/speschka/archive/2011/01/27/migrating-user-accounts-from-windows-claims-to-saml-claims.aspx.  Just as with that post, our good friend Raju S. had some interesting information to add to this content today.  One of our other "friends of the blog", Israel V., noticed that after a recent migration he did, the identities for workflows were not updated.  It turns out Raju had seen this before in a previous version of SharePoint (when migrating between different domains) and had written some code to fix up the issue.  The net of what you need to do here is go through your workflow associations and update the accounts that are associated with them.

Each content type, list and web has a property called WorkflowAssociations where it stores this information.  It's just a collection, so you can enumerate through each one, but as you can imagine this may take some time to walk through an entire web application, so plan accordingly.  A specific workflow association is really just a chunk of XML, so it's probably best to retrieve the AssociationData property and take a look at the XML to get familiar with it.  As you review it, you should notice nodes for person, account ID and display name – those are the values that you want to change.  After you change the XML you can just push it back into the AssociationData property and update the workflow association; a rough sketch of that loop follows.
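Here's a rough sketch of what that loop could look like for the lists in a site collection; webs and content types expose the same WorkflowAssociations collection and would get the same treatment.  The straight string replacement and the use of the collection's Update method are my assumptions, not Raju's actual code, so study your own AssociationData XML and adjust accordingly.

using System;
using System.Collections.Generic;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Workflow;

// Sketch: swap the old account identity for the new one in every list-level
// workflow association in a site collection.
public static void FixWorkflowAssociations(SPSite site, string oldAccount, string newAccount)
{
    foreach (SPWeb web in site.AllWebs)
    {
        try
        {
            foreach (SPList list in web.Lists)
            {
                var changed = new List<SPWorkflowAssociation>();

                foreach (SPWorkflowAssociation assoc in list.WorkflowAssociations)
                {
                    string data = assoc.AssociationData;

                    if (!string.IsNullOrEmpty(data) && data.Contains(oldAccount))
                    {
                        // Naive replacement; target the person/account ID/display
                        // name nodes in the XML for anything more precise.
                        assoc.AssociationData = data.Replace(oldAccount, newAccount);
                        changed.Add(assoc);
                    }
                }

                foreach (SPWorkflowAssociation assoc in changed)
                {
                    list.WorkflowAssociations.Update(assoc);
                }
            }
        }
        finally
        {
            web.Dispose();
        }
    }
}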

Thanks again to Israel for calling this problem out and to Raju for sharing his solution.